Apr 17 20:15:30.224090 ip-10-0-139-2 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 20:15:30.626027 ip-10-0-139-2 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:15:30.626027 ip-10-0-139-2 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 20:15:30.626027 ip-10-0-139-2 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:15:30.626027 ip-10-0-139-2 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 20:15:30.626027 ip-10-0-139-2 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:15:30.627632 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.627539 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 20:15:30.630676 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630656 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:15:30.630676 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630674 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:15:30.630676 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630679 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630684 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630688 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630692 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630696 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630700 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630704 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630709 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630712 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630718 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630722 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630726 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630730 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630734 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630738 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630761 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630765 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630770 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630774 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630778 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:15:30.630886 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630781 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630785 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630790 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630794 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630798 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630802 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630806 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630810 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630817 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630821 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630825 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630829 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630834 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630838 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630843 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630847 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630851 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630857 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630862 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630867 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:15:30.631634 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630871 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630874 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630879 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630882 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630886 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630890 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630894 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630899 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630903 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630907 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630910 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630914 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630918 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630923 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630927 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630931 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630938 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630945 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630949 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630953 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:15:30.632453 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630957 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630961 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630966 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630970 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630974 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630978 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630983 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630987 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630991 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.630995 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631001 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631005 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631014 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631020 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631025 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631030 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631035 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631039 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631043 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:15:30.632985 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631047 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631052 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631056 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631060 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.631064 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.632927 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.632948 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.632952 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.632960 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.632965 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.632970 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.632974 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.632980 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.632984 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.632988 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.632994 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.632998 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633001 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633005 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:15:30.633597 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633007 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633017 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633022 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633027 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633031 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633035 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633040 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633044 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633048 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633052 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633056 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633061 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633065 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633069 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633077 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633080 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633083 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633086 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633089 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633123 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:15:30.634098 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633657 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633666 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633670 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633674 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633677 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633680 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633683 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633687 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633694 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633697 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633700 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633703 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633705 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633708 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633710 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633713 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633716 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633719 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633721 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633724 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:15:30.634591 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633726 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633729 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633731 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633734 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633736 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633740 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633755 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633758 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633761 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633763 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633766 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633768 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633772 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633775 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633778 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633781 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633783 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633786 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633788 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633791 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:15:30.635138 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633794 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633797 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633799 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633802 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633805 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633807 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633810 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633812 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633814 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633818 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633820 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.633824 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633902 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633911 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633918 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633922 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633927 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633931 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633935 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633940 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633943 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 20:15:30.635624 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633946 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633950 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633953 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633956 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633960 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633962 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633965 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633968 2579 flags.go:64] FLAG: --cloud-config=""
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633971 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633974 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633979 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633981 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633985 2579 flags.go:64] FLAG: --config-dir=""
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633987 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633991 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633995 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.633998 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634001 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634004 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634007 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634010 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634013 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634017 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634020 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634024 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 20:15:30.636161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634027 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634030 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634033 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634036 2579 flags.go:64] FLAG: --enable-server="true"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634039 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634044 2579 flags.go:64] FLAG: --event-burst="100"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634047 2579 flags.go:64] FLAG: --event-qps="50"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634050 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634053 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634056 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634060 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634063 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634066 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634069 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634072 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634075 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634078 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634080 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634084 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634087 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634090 2579 flags.go:64] FLAG: --feature-gates=""
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634098 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634101 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634104 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634115 2579 flags.go:64] FLAG:
--healthz-bind-address="127.0.0.1" Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634118 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 17 20:15:30.636769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634121 2579 flags.go:64] FLAG: --help="false" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634124 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-139-2.ec2.internal" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634127 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634130 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634133 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634137 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634140 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634143 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634146 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634149 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634152 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634154 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: 
I0417 20:15:30.634157 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634160 2579 flags.go:64] FLAG: --kube-reserved="" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634163 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634166 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634169 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634171 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634174 2579 flags.go:64] FLAG: --lock-file="" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634177 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634180 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634183 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634188 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 20:15:30.637402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634191 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634194 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634197 2579 flags.go:64] FLAG: --logging-format="text" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634200 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 20:15:30.637970 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:15:30.634203 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634205 2579 flags.go:64] FLAG: --manifest-url="" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634208 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634213 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634217 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634222 2579 flags.go:64] FLAG: --max-pods="110" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634225 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634228 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634231 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634234 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634236 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634239 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634242 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634250 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634253 2579 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634256 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634259 2579 flags.go:64] FLAG: --pod-cidr="" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634262 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634267 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634270 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 20:15:30.637970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634273 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634277 2579 flags.go:64] FLAG: --port="10250" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634280 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634282 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09c2ef73502099b44" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634285 2579 flags.go:64] FLAG: --qos-reserved="" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634288 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634291 2579 flags.go:64] FLAG: --register-node="true" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634294 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634296 2579 flags.go:64] FLAG: --register-with-taints="" Apr 17 20:15:30.638550 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634300 2579 flags.go:64] FLAG: --registry-burst="10" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634303 2579 flags.go:64] FLAG: --registry-qps="5" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634306 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634308 2579 flags.go:64] FLAG: --reserved-memory="" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634312 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634315 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634318 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634321 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634325 2579 flags.go:64] FLAG: --runonce="false" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634328 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634331 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634333 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634336 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634339 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634342 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 20:15:30.638550 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634345 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634348 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 20:15:30.638550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634351 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634353 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634356 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634359 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634362 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634365 2579 flags.go:64] FLAG: --system-cgroups="" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634367 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634374 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634377 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634379 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634385 2579 flags.go:64] FLAG: --tls-min-version="" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634388 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634391 2579 flags.go:64] FLAG: 
--topology-manager-policy="none" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634393 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634412 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634419 2579 flags.go:64] FLAG: --v="2" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634424 2579 flags.go:64] FLAG: --version="false" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634428 2579 flags.go:64] FLAG: --vmodule="" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634432 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634436 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634535 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634539 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634542 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634545 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:15:30.639185 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634548 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634551 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:15:30.639755 ip-10-0-139-2 
kubenswrapper[2579]: W0417 20:15:30.634554 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634558 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634562 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634565 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634568 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634571 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634573 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634576 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634578 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634581 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634583 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634586 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634588 2579 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634591 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634594 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634596 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634599 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634601 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:15:30.639755 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634604 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634607 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634609 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634614 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634616 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634619 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634621 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634624 2579 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634626 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634629 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634631 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634634 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634636 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634639 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634641 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634644 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634646 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634649 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634652 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634654 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:15:30.640291 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634657 
2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634660 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634662 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634665 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634667 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634669 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634672 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634674 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634676 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634679 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634681 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634684 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634686 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:15:30.640828 ip-10-0-139-2 
kubenswrapper[2579]: W0417 20:15:30.634688 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634691 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634694 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634697 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634699 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634702 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634704 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:15:30.640828 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634707 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634709 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634712 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634714 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634716 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634719 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:15:30.641335 
ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634723 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634727 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634730 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634733 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634735 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634738 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634760 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634763 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634766 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634772 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634774 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634777 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634780 2579 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:15:30.641335 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634782 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:15:30.641823 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634785 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:15:30.641823 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.634787 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:15:30.641823 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.634793 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:15:30.641912 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.641836 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 20:15:30.641912 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.641856 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 20:15:30.641912 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641910 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641916 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641920 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641923 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641926 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641929 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641931 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641934 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641937 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641940 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641942 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641945 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641947 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641950 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641952 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641955 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641958 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641960 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641963 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641965 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:15:30.641989 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641968 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641970 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641973 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641976 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641978 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641981 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641984 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641986 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641989 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641991 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641995 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.641998 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642001 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642004 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642006 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642009 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642011 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642014 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642016 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642020 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:15:30.642475 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642024 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642026 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642029 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642031 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642034 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642036 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642039 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642041 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642043 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642046 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642048 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642051 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642053 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642056 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642059 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642062 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642064 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642067 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642069 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642072 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:15:30.642981 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642075 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642077 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642080 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642082 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642086 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642088 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642091 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642093 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642095 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642098 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642101 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642103 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642105 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642108 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642112 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642117 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642120 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642123 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642126 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:15:30.643455 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642129 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642131 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642134 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642137 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642139 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642141 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642145 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.642151 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642251 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642255 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642258 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642261 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642264 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642266 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642269 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642271 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:15:30.643979 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642274 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642277 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642280 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642283 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642285 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642288 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642290 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642293 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642295 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642298 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642300 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642302 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642305 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642307 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642310 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642312 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642315 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642317 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642320 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642322 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:15:30.644383 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642324 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642327 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642329 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642333 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642336 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642338 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642341 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642344 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642346 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642348 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642351 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642353 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642356 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642359 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642361 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642368 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642371 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642374 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642376 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642378 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:15:30.644882 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642381 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642384 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642386 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642389 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642391 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642394 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642396 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642399 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642402 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642405 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642407 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642411 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642414 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642417 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642420 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642423 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642426 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642429 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642432 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:15:30.645363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642434 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642437 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642439 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642442 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642444 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642447 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642449 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642452 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642454 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642458 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642461 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642464 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642466 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642469 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642471 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642473 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642476 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642478 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:15:30.645843 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:30.642481 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:15:30.646310 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.642486 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:15:30.646310 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.643214 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 20:15:30.646310 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.646130 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 20:15:30.646981 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.646969 2579 server.go:1019] "Starting client certificate rotation"
Apr 17 20:15:30.647082 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.647067 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 20:15:30.647119 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.647106 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 20:15:30.671185 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.671156 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 20:15:30.675123 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.675103 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 20:15:30.691598 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.691575 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 17 20:15:30.697368 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.697351 2579 log.go:25] "Validated CRI v1 image API"
Apr 17 20:15:30.698788 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.698768 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 20:15:30.700252 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.700236 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 20:15:30.702895 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.702872 2579 fs.go:135] Filesystem UUIDs: map[71a9feb7-bab9-47b3-991a-b6e7d87f0d98:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 bec43421-aa14-4b80-8913-42efff67c572:/dev/nvme0n1p4]
Apr 17 20:15:30.702972 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.702892 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 20:15:30.708590 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.708471 2579 manager.go:217] Machine: {Timestamp:2026-04-17 20:15:30.706598153 +0000 UTC m=+0.371523834 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096082 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26e7aba733cf6130c2d987d08004ef SystemUUID:ec26e7ab-a733-cf61-30c2-d987d08004ef BootID:b23101f7-b3fe-4079-b003-8274919ddce2 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3c:3c:5c:47:ab Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3c:3c:5c:47:ab Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:fa:83:dc:e4:d1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 20:15:30.708590 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.708584 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 20:15:30.708723 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.708711 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 20:15:30.711035 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.711010 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 20:15:30.711184 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.711038 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-2.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 20:15:30.711230 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.711193 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 20:15:30.711230 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.711202 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 20:15:30.711230 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.711215 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 20:15:30.712135 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.712124 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 20:15:30.713731 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.713719 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 20:15:30.713857 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.713848 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 20:15:30.715971 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.715960 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 20:15:30.716006 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.715976 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 20:15:30.716006 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.715988 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17
20:15:30.716006 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.715997 2579 kubelet.go:397] "Adding apiserver pod source" Apr 17 20:15:30.716006 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.716007 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 20:15:30.717123 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.717109 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 20:15:30.717182 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.717130 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 20:15:30.717511 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.717493 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9fppn" Apr 17 20:15:30.720174 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.720156 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 20:15:30.721615 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.721600 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 20:15:30.723433 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.723420 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 20:15:30.723490 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.723438 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 20:15:30.723490 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.723447 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 20:15:30.723490 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.723455 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 
20:15:30.723490 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.723464 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 20:15:30.723490 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.723475 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 20:15:30.723490 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.723483 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 20:15:30.723490 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.723489 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 20:15:30.723680 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.723497 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 20:15:30.723680 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.723502 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 20:15:30.723680 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.723516 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 20:15:30.723680 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.723525 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 20:15:30.724077 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.724060 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9fppn" Apr 17 20:15:30.724284 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.724270 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 20:15:30.724322 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.724289 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 20:15:30.728570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.728413 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 20:15:30.728671 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:15:30.728591 2579 server.go:1295] "Started kubelet" Apr 17 20:15:30.728737 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.728682 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 20:15:30.728999 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.728960 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 20:15:30.729053 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.729011 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 20:15:30.729723 ip-10-0-139-2 systemd[1]: Started Kubernetes Kubelet. Apr 17 20:15:30.732148 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.732090 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 20:15:30.732483 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.732456 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:30.733586 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.733568 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 17 20:15:30.736307 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.736286 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:30.737356 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.737323 2579 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-2.ec2.internal" not found Apr 17 20:15:30.738640 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.738621 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 20:15:30.739009 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.738676 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 20:15:30.739655 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.739636 2579 volume_manager.go:295] "The desired_state_of_world 
populator starts" Apr 17 20:15:30.739732 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.739660 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 20:15:30.739732 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.739640 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 20:15:30.739862 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.739792 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 17 20:15:30.739862 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.739799 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 17 20:15:30.740001 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:30.739977 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-2.ec2.internal\" not found" Apr 17 20:15:30.741047 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.741026 2579 factory.go:55] Registering systemd factory Apr 17 20:15:30.741129 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.741060 2579 factory.go:223] Registration of the systemd container factory successfully Apr 17 20:15:30.741129 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.741117 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:30.741462 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.741442 2579 factory.go:153] Registering CRI-O factory Apr 17 20:15:30.741462 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.741462 2579 factory.go:223] Registration of the crio container factory successfully Apr 17 20:15:30.741596 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.741534 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 20:15:30.741596 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.741555 2579 
factory.go:103] Registering Raw factory Apr 17 20:15:30.741596 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.741570 2579 manager.go:1196] Started watching for new ooms in manager Apr 17 20:15:30.741998 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.741942 2579 manager.go:319] Starting recovery of all containers Apr 17 20:15:30.742608 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:30.742587 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 20:15:30.743726 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:30.743706 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-2.ec2.internal\" not found" node="ip-10-0-139-2.ec2.internal" Apr 17 20:15:30.751151 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.751124 2579 manager.go:324] Recovery completed Apr 17 20:15:30.752920 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.752894 2579 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-2.ec2.internal" not found Apr 17 20:15:30.755992 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.755978 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:15:30.758115 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.758099 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-2.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:15:30.758186 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.758127 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-2.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:15:30.758186 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.758138 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-2.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:15:30.758623 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:15:30.758604 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 20:15:30.758623 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.758620 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 20:15:30.758730 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.758637 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:15:30.761231 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.761213 2579 policy_none.go:49] "None policy: Start" Apr 17 20:15:30.761326 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.761235 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 20:15:30.761326 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.761249 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 17 20:15:30.796499 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.796477 2579 manager.go:341] "Starting Device Plugin manager" Apr 17 20:15:30.817551 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:30.796539 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 20:15:30.817551 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.796555 2579 server.go:85] "Starting device plugin registration server" Apr 17 20:15:30.817551 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.796875 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 20:15:30.817551 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.796888 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 20:15:30.817551 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.796981 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 20:15:30.817551 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.797064 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 20:15:30.817551 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.797074 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 20:15:30.817551 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:30.797608 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 20:15:30.817551 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:30.797645 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-2.ec2.internal\" not found" Apr 17 20:15:30.817551 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.811426 2579 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-2.ec2.internal" not found Apr 17 20:15:30.871609 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.871572 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 20:15:30.872951 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.872924 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 20:15:30.872951 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.872955 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 20:15:30.873102 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.872975 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 20:15:30.873102 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.872981 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 20:15:30.873102 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:30.873011 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 20:15:30.876189 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.876136 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:30.897174 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.897143 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:15:30.898301 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.898282 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-2.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:15:30.898407 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.898312 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-2.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:15:30.898407 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.898327 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-2.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:15:30.898407 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.898352 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-2.ec2.internal" Apr 17 20:15:30.906241 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.906220 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-2.ec2.internal" Apr 17 20:15:30.973875 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.973830 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-2.ec2.internal"] Apr 17 
20:15:30.978503 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.978477 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" Apr 17 20:15:30.978646 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:30.978481 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.008712 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.008684 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.013344 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.013326 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.023940 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.023920 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 20:15:31.026459 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.026439 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 20:15:31.041831 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.041799 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a71794485a15783c46c26c3b9e1289ac-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal\" (UID: \"a71794485a15783c46c26c3b9e1289ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.041939 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.041836 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a71794485a15783c46c26c3b9e1289ac-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal\" (UID: \"a71794485a15783c46c26c3b9e1289ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.041939 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.041854 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a95c6233e62f070e1ff4cac4a1fc713-config\") pod \"kube-apiserver-proxy-ip-10-0-139-2.ec2.internal\" (UID: \"2a95c6233e62f070e1ff4cac4a1fc713\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.142487 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.142400 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a71794485a15783c46c26c3b9e1289ac-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal\" (UID: \"a71794485a15783c46c26c3b9e1289ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.142487 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.142434 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a71794485a15783c46c26c3b9e1289ac-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal\" (UID: \"a71794485a15783c46c26c3b9e1289ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.142487 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.142455 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a95c6233e62f070e1ff4cac4a1fc713-config\") pod 
\"kube-apiserver-proxy-ip-10-0-139-2.ec2.internal\" (UID: \"2a95c6233e62f070e1ff4cac4a1fc713\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.142686 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.142501 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a95c6233e62f070e1ff4cac4a1fc713-config\") pod \"kube-apiserver-proxy-ip-10-0-139-2.ec2.internal\" (UID: \"2a95c6233e62f070e1ff4cac4a1fc713\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.142686 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.142510 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a71794485a15783c46c26c3b9e1289ac-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal\" (UID: \"a71794485a15783c46c26c3b9e1289ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.142686 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.142509 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a71794485a15783c46c26c3b9e1289ac-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal\" (UID: \"a71794485a15783c46c26c3b9e1289ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.325567 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.325527 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.329956 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.329929 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-2.ec2.internal" Apr 17 20:15:31.647481 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.647448 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 20:15:31.648191 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.647592 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 20:15:31.648191 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.647626 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 20:15:31.648191 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.647632 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 20:15:31.716186 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.716155 2579 apiserver.go:52] "Watching apiserver" Apr 17 20:15:31.723366 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.723327 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 20:15:31.724515 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.724491 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/konnectivity-agent-mcn8b","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s","openshift-dns/node-resolver-w9tk6","openshift-image-registry/node-ca-msb4t","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-2.ec2.internal","openshift-cluster-node-tuning-operator/tuned-r4bks","openshift-multus/multus-additional-cni-plugins-p25dr","openshift-multus/multus-r82dk","openshift-multus/network-metrics-daemon-842wl","openshift-network-diagnostics/network-check-target-q6spr","openshift-network-operator/iptables-alerter-8j7zn","openshift-ovn-kubernetes/ovnkube-node-46w54"] Apr 17 20:15:31.726225 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.726194 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 20:10:30 +0000 UTC" deadline="2027-11-13 06:28:32.03746629 +0000 UTC" Apr 17 20:15:31.726225 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.726220 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13786h13m0.311248262s" Apr 17 20:15:31.726965 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.726950 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.729069 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.729039 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gp4jr\"" Apr 17 20:15:31.729175 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.729071 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 20:15:31.729230 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.729173 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" Apr 17 20:15:31.729230 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.729203 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:15:31.731631 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.731599 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 20:15:31.731832 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.731614 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 20:15:31.731832 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.731617 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 20:15:31.731832 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.731599 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8dfth\"" Apr 17 20:15:31.732004 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.731992 
2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w9tk6" Apr 17 20:15:31.734290 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.734273 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cv7rk\"" Apr 17 20:15:31.734377 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.734319 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 20:15:31.734420 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.734403 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 20:15:31.736777 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.736759 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-msb4t" Apr 17 20:15:31.736863 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.736816 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mcn8b" Apr 17 20:15:31.738810 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.738795 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 20:15:31.738965 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.738948 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 20:15:31.738965 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.738959 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6njhp\"" Apr 17 20:15:31.739081 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.739027 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.739081 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.739037 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 20:15:31.739081 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.739060 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 20:15:31.739250 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.739232 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ct92n\"" Apr 17 20:15:31.739309 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.739261 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 20:15:31.739309 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.739265 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 20:15:31.741103 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.741083 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 20:15:31.741223 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.741208 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.741305 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.741283 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 20:15:31.741581 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.741565 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 20:15:31.741643 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.741634 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-w2cwx\"" Apr 17 20:15:31.741702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.741667 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 20:15:31.741774 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.741698 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 20:15:31.743002 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.742987 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gqhsv\"" Apr 17 20:15:31.743458 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.743442 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:31.743549 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:31.743522 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc" Apr 17 20:15:31.744453 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.744437 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 20:15:31.745496 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745475 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-host\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.745563 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-kubernetes\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.745563 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745538 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-sys\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.745563 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745557 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-var-lib-cni-bin\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.745707 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:15:31.745574 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-run-multus-certs\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.745707 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745598 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6jn\" (UniqueName: \"kubernetes.io/projected/e38306e8-30e0-42b4-889f-6950beb72b21-kube-api-access-zz6jn\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" Apr 17 20:15:31.745707 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745636 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-run-k8s-cni-cncf-io\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.745707 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b810562f-1e78-430b-bb52-4ddd48b17312-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.745707 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745685 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18-serviceca\") pod 
\"node-ca-msb4t\" (UID: \"963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18\") " pod="openshift-image-registry/node-ca-msb4t" Apr 17 20:15:31.745964 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745710 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/80b2534b-c049-4a65-8cdb-fc90c54d1a82-cni-binary-copy\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.745964 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745731 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-sysconfig\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.745964 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745775 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-8j7zn" Apr 17 20:15:31.745964 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745776 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b810562f-1e78-430b-bb52-4ddd48b17312-os-release\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.745964 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745878 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b810562f-1e78-430b-bb52-4ddd48b17312-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.745964 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745910 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-socket-dir\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" Apr 17 20:15:31.745964 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745934 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/80b2534b-c049-4a65-8cdb-fc90c54d1a82-multus-daemon-config\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.745964 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f04f579-459c-4246-9509-08688b2cebb7-tmp\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.746253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.745983 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz6t6\" (UniqueName: \"kubernetes.io/projected/0f04f579-459c-4246-9509-08688b2cebb7-kube-api-access-bz6t6\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.746253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746031 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" Apr 17 20:15:31.746253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746054 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/804b57c1-49b2-4e56-8da1-70a591e070e2-tmp-dir\") pod \"node-resolver-w9tk6\" (UID: \"804b57c1-49b2-4e56-8da1-70a591e070e2\") " pod="openshift-dns/node-resolver-w9tk6" Apr 17 20:15:31.746253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746087 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-system-cni-dir\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.746253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746103 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-var-lib-kubelet\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.746253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746147 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-hostroot\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.746253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746185 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npktn\" (UniqueName: \"kubernetes.io/projected/b810562f-1e78-430b-bb52-4ddd48b17312-kube-api-access-npktn\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.746253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-os-release\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.746575 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746255 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/955d42b9-2b82-4faa-aa56-05c806e38889-konnectivity-ca\") pod \"konnectivity-agent-mcn8b\" (UID: \"955d42b9-2b82-4faa-aa56-05c806e38889\") " pod="kube-system/konnectivity-agent-mcn8b" Apr 17 
20:15:31.746575 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746284 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b810562f-1e78-430b-bb52-4ddd48b17312-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.746575 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746331 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-sys-fs\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" Apr 17 20:15:31.746575 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746364 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-multus-conf-dir\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.746575 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746413 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/955d42b9-2b82-4faa-aa56-05c806e38889-agent-certs\") pod \"konnectivity-agent-mcn8b\" (UID: \"955d42b9-2b82-4faa-aa56-05c806e38889\") " pod="kube-system/konnectivity-agent-mcn8b" Apr 17 20:15:31.746575 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746438 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b810562f-1e78-430b-bb52-4ddd48b17312-cnibin\") pod 
\"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.746575 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746463 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-device-dir\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" Apr 17 20:15:31.746575 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746490 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-multus-cni-dir\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.746575 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746524 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-sysctl-d\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.746575 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746560 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-sysctl-conf\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746589 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-systemd\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746623 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-multus-socket-dir-parent\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746655 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-run-netns\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746699 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-var-lib-kubelet\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746726 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/804b57c1-49b2-4e56-8da1-70a591e070e2-hosts-file\") pod \"node-resolver-w9tk6\" (UID: \"804b57c1-49b2-4e56-8da1-70a591e070e2\") " pod="openshift-dns/node-resolver-w9tk6" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746788 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b810562f-1e78-430b-bb52-4ddd48b17312-cni-binary-copy\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746812 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-run\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746836 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms7cb\" (UniqueName: \"kubernetes.io/projected/804b57c1-49b2-4e56-8da1-70a591e070e2-kube-api-access-ms7cb\") pod \"node-resolver-w9tk6\" (UID: \"804b57c1-49b2-4e56-8da1-70a591e070e2\") " pod="openshift-dns/node-resolver-w9tk6" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746860 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-lib-modules\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746900 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18-host\") pod \"node-ca-msb4t\" (UID: \"963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18\") " pod="openshift-image-registry/node-ca-msb4t" Apr 17 20:15:31.747052 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746926 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g66d2\" (UniqueName: \"kubernetes.io/projected/963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18-kube-api-access-g66d2\") pod \"node-ca-msb4t\" (UID: \"963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18\") " pod="openshift-image-registry/node-ca-msb4t" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-var-lib-cni-multus\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746973 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-etc-kubernetes\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.746996 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-modprobe-d\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.747052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.747039 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-registration-dir\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: 
\"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" Apr 17 20:15:31.747484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.747083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-etc-selinux\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" Apr 17 20:15:31.747484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.747109 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-cnibin\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.747484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.747137 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b810562f-1e78-430b-bb52-4ddd48b17312-system-cni-dir\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.747484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.747160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0f04f579-459c-4246-9509-08688b2cebb7-etc-tuned\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.747484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.747183 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hpjs\" 
(UniqueName: \"kubernetes.io/projected/80b2534b-c049-4a65-8cdb-fc90c54d1a82-kube-api-access-6hpjs\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.748049 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.747738 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:15:31.748049 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.747834 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 20:15:31.748049 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.747861 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-h88qx\"" Apr 17 20:15:31.748224 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.748106 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:31.748224 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.748109 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 20:15:31.748224 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:31.748176 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2" Apr 17 20:15:31.749607 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.749586 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 20:15:31.750570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.750553 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.754022 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.754000 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 20:15:31.754165 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.754072 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 20:15:31.754165 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.754077 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l6bsz\"" Apr 17 20:15:31.754165 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.754077 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 20:15:31.754165 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.754129 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 20:15:31.754165 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.754140 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 20:15:31.754165 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.754086 2579 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 20:15:31.766166 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.766146 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8s4b5" Apr 17 20:15:31.773505 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.773484 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8s4b5" Apr 17 20:15:31.836485 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:31.836457 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a95c6233e62f070e1ff4cac4a1fc713.slice/crio-f802c61a75a61db86f4df98232a32a97a274134eac253e22114000e20a8243f2 WatchSource:0}: Error finding container f802c61a75a61db86f4df98232a32a97a274134eac253e22114000e20a8243f2: Status 404 returned error can't find the container with id f802c61a75a61db86f4df98232a32a97a274134eac253e22114000e20a8243f2 Apr 17 20:15:31.837358 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:31.837336 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda71794485a15783c46c26c3b9e1289ac.slice/crio-e00ab0affaa74a4334487e6f293b5ea5d1c86f4311680ce5d4d2305e06ee27b7 WatchSource:0}: Error finding container e00ab0affaa74a4334487e6f293b5ea5d1c86f4311680ce5d4d2305e06ee27b7: Status 404 returned error can't find the container with id e00ab0affaa74a4334487e6f293b5ea5d1c86f4311680ce5d4d2305e06ee27b7 Apr 17 20:15:31.840491 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.840473 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 20:15:31.841863 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.841815 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:15:31.847330 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:15:31.847316    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g66d2\" (UniqueName: \"kubernetes.io/projected/963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18-kube-api-access-g66d2\") pod \"node-ca-msb4t\" (UID: \"963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18\") " pod="openshift-image-registry/node-ca-msb4t"
Apr 17 20:15:31.847382 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847349    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-var-lib-cni-multus\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.847432 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847377    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-etc-kubernetes\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.847479 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847432    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-var-lib-openvswitch\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.847479 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847441    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-var-lib-cni-multus\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.847479 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847470    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.847606 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847500    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-modprobe-d\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.847606 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847580    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-registration-dir\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s"
Apr 17 20:15:31.847702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847610    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-etc-selinux\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s"
Apr 17 20:15:31.847702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847614    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-etc-kubernetes\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.847702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847664    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-cnibin\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.847702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847694    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-registration-dir\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s"
Apr 17 20:15:31.847702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847696    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-kubelet\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.847946 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847738    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b810562f-1e78-430b-bb52-4ddd48b17312-system-cni-dir\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr"
Apr 17 20:15:31.847946 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847778    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-etc-selinux\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s"
Apr 17 20:15:31.847946 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847875    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-modprobe-d\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.847946 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847876    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b810562f-1e78-430b-bb52-4ddd48b17312-system-cni-dir\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr"
Apr 17 20:15:31.847946 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847886    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-cnibin\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.847946 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847935    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0f04f579-459c-4246-9509-08688b2cebb7-etc-tuned\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.848176 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847974    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hpjs\" (UniqueName: \"kubernetes.io/projected/80b2534b-c049-4a65-8cdb-fc90c54d1a82-kube-api-access-6hpjs\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.848176 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.847998    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-etc-openvswitch\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.848176 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848016    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-host\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.848176 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848031    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-systemd-units\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.848176 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848079    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-run-openvswitch\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.848176 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848085    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-host\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.848176 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848112    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/200427fe-0d95-4f14-9c75-fa998acab9e6-ovnkube-script-lib\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.848465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848234    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-kubernetes\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.848465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848265    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-sys\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.848465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848291    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-var-lib-cni-bin\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.848465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848318    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-run-multus-certs\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.848465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848345    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-run-ovn\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.848465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848385    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-cni-bin\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.848465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848400    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-sys\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.848465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848414    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6jn\" (UniqueName: \"kubernetes.io/projected/e38306e8-30e0-42b4-889f-6950beb72b21-kube-api-access-zz6jn\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s"
Apr 17 20:15:31.848465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848464    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-kubernetes\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848473    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-run-multus-certs\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848498    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-run-k8s-cni-cncf-io\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848512    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-var-lib-cni-bin\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848542    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-run-k8s-cni-cncf-io\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848539    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bczm\" (UniqueName: \"kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm\") pod \"network-check-target-q6spr\" (UID: \"e114821c-4bf6-4656-8172-0f7ba8948fdc\") " pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848577    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/200427fe-0d95-4f14-9c75-fa998acab9e6-env-overrides\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848595    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b810562f-1e78-430b-bb52-4ddd48b17312-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848611    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18-serviceca\") pod \"node-ca-msb4t\" (UID: \"963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18\") " pod="openshift-image-registry/node-ca-msb4t"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848628    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/80b2534b-c049-4a65-8cdb-fc90c54d1a82-cni-binary-copy\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848648    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-cni-netd\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848781    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-sysconfig\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848815    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848842    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-node-log\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.848887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848871    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b810562f-1e78-430b-bb52-4ddd48b17312-os-release\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848898    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b810562f-1e78-430b-bb52-4ddd48b17312-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848924    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-socket-dir\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848957    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/80b2534b-c049-4a65-8cdb-fc90c54d1a82-multus-daemon-config\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848963    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-sysconfig\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.848988    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849027    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f04f579-459c-4246-9509-08688b2cebb7-tmp\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849052    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bz6t6\" (UniqueName: \"kubernetes.io/projected/0f04f579-459c-4246-9509-08688b2cebb7-kube-api-access-bz6t6\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849098    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849125    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/804b57c1-49b2-4e56-8da1-70a591e070e2-tmp-dir\") pod \"node-resolver-w9tk6\" (UID: \"804b57c1-49b2-4e56-8da1-70a591e070e2\") " pod="openshift-dns/node-resolver-w9tk6"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849152    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-system-cni-dir\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849178    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-var-lib-kubelet\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849204    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-hostroot\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849232    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k5hj\" (UniqueName: \"kubernetes.io/projected/6c2533fa-a070-45de-b5b1-0a51d7a87b22-kube-api-access-5k5hj\") pod \"iptables-alerter-8j7zn\" (UID: \"6c2533fa-a070-45de-b5b1-0a51d7a87b22\") " pod="openshift-network-operator/iptables-alerter-8j7zn"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849257    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-socket-dir\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849264    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npktn\" (UniqueName: \"kubernetes.io/projected/b810562f-1e78-430b-bb52-4ddd48b17312-kube-api-access-npktn\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849269    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b810562f-1e78-430b-bb52-4ddd48b17312-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr"
Apr 17 20:15:31.849570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849121    2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849290    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-os-release\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849296    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/80b2534b-c049-4a65-8cdb-fc90c54d1a82-cni-binary-copy\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849317    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ftcw\" (UniqueName: \"kubernetes.io/projected/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-kube-api-access-5ftcw\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849348    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-run-systemd\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849366    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-hostroot\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849373    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/200427fe-0d95-4f14-9c75-fa998acab9e6-ovn-node-metrics-cert\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849421    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmtvd\" (UniqueName: \"kubernetes.io/projected/200427fe-0d95-4f14-9c75-fa998acab9e6-kube-api-access-kmtvd\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849462    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-var-lib-kubelet\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849152    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18-serviceca\") pod \"node-ca-msb4t\" (UID: \"963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18\") " pod="openshift-image-registry/node-ca-msb4t"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849536    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b810562f-1e78-430b-bb52-4ddd48b17312-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849545    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849561    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/80b2534b-c049-4a65-8cdb-fc90c54d1a82-multus-daemon-config\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849610    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b810562f-1e78-430b-bb52-4ddd48b17312-os-release\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849696    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-os-release\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849426    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-system-cni-dir\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849826    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/804b57c1-49b2-4e56-8da1-70a591e070e2-tmp-dir\") pod \"node-resolver-w9tk6\" (UID: \"804b57c1-49b2-4e56-8da1-70a591e070e2\") " pod="openshift-dns/node-resolver-w9tk6"
Apr 17 20:15:31.850351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849963    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/955d42b9-2b82-4faa-aa56-05c806e38889-konnectivity-ca\") pod \"konnectivity-agent-mcn8b\" (UID: \"955d42b9-2b82-4faa-aa56-05c806e38889\") " pod="kube-system/konnectivity-agent-mcn8b"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.849997    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b810562f-1e78-430b-bb52-4ddd48b17312-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850025    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-sys-fs\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850053    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-multus-conf-dir\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850342    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/955d42b9-2b82-4faa-aa56-05c806e38889-agent-certs\") pod \"konnectivity-agent-mcn8b\" (UID: \"955d42b9-2b82-4faa-aa56-05c806e38889\") " pod="kube-system/konnectivity-agent-mcn8b"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850403    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b810562f-1e78-430b-bb52-4ddd48b17312-cnibin\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850456    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/955d42b9-2b82-4faa-aa56-05c806e38889-konnectivity-ca\") pod \"konnectivity-agent-mcn8b\" (UID: \"955d42b9-2b82-4faa-aa56-05c806e38889\") " pod="kube-system/konnectivity-agent-mcn8b"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850547    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-sys-fs\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850591    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-device-dir\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850626    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-multus-cni-dir\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850663    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6c2533fa-a070-45de-b5b1-0a51d7a87b22-iptables-alerter-script\") pod \"iptables-alerter-8j7zn\" (UID: \"6c2533fa-a070-45de-b5b1-0a51d7a87b22\") " pod="openshift-network-operator/iptables-alerter-8j7zn"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850695    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-run-netns\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850761    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-sysctl-d\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850794    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-sysctl-conf\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850825    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-systemd\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850855    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-multus-socket-dir-parent\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850888    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b810562f-1e78-430b-bb52-4ddd48b17312-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr"
Apr 17 20:15:31.851113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850919    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-run-netns\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.850997    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-multus-socket-dir-parent\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851023    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-host-run-netns\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851051    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-multus-conf-dir\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk"
Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851156    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName:
\"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-slash\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851176 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-sysctl-conf\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851199 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-var-lib-kubelet\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851216 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-systemd\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/804b57c1-49b2-4e56-8da1-70a591e070e2-hosts-file\") pod \"node-resolver-w9tk6\" (UID: \"804b57c1-49b2-4e56-8da1-70a591e070e2\") " pod="openshift-dns/node-resolver-w9tk6" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851239 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/80b2534b-c049-4a65-8cdb-fc90c54d1a82-multus-cni-dir\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851262 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b810562f-1e78-430b-bb52-4ddd48b17312-cnibin\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851321 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6c2533fa-a070-45de-b5b1-0a51d7a87b22-host-slash\") pod \"iptables-alerter-8j7zn\" (UID: \"6c2533fa-a070-45de-b5b1-0a51d7a87b22\") " pod="openshift-network-operator/iptables-alerter-8j7zn" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851360 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e38306e8-30e0-42b4-889f-6950beb72b21-device-dir\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851363 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b810562f-1e78-430b-bb52-4ddd48b17312-cni-binary-copy\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851409 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/804b57c1-49b2-4e56-8da1-70a591e070e2-hosts-file\") pod \"node-resolver-w9tk6\" (UID: \"804b57c1-49b2-4e56-8da1-70a591e070e2\") " pod="openshift-dns/node-resolver-w9tk6" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851436 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-etc-sysctl-d\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851468 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-run\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851498 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-var-lib-kubelet\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.851854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851510 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ms7cb\" (UniqueName: \"kubernetes.io/projected/804b57c1-49b2-4e56-8da1-70a591e070e2-kube-api-access-ms7cb\") pod \"node-resolver-w9tk6\" (UID: \"804b57c1-49b2-4e56-8da1-70a591e070e2\") " pod="openshift-dns/node-resolver-w9tk6" Apr 17 20:15:31.852557 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851514 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-run\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.852557 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851566 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-log-socket\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.852557 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851679 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/200427fe-0d95-4f14-9c75-fa998acab9e6-ovnkube-config\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.852557 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851705 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-lib-modules\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.852557 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851827 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f04f579-459c-4246-9509-08688b2cebb7-lib-modules\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.852557 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.851888 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18-host\") pod \"node-ca-msb4t\" (UID: \"963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18\") " pod="openshift-image-registry/node-ca-msb4t" Apr 17 20:15:31.852557 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.852458 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18-host\") pod \"node-ca-msb4t\" (UID: \"963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18\") " pod="openshift-image-registry/node-ca-msb4t" Apr 17 20:15:31.853002 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.852980 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f04f579-459c-4246-9509-08688b2cebb7-tmp\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.853508 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.853487 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/955d42b9-2b82-4faa-aa56-05c806e38889-agent-certs\") pod \"konnectivity-agent-mcn8b\" (UID: \"955d42b9-2b82-4faa-aa56-05c806e38889\") " pod="kube-system/konnectivity-agent-mcn8b" Apr 17 20:15:31.853695 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.853677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b810562f-1e78-430b-bb52-4ddd48b17312-cni-binary-copy\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.855278 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.855245 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g66d2\" (UniqueName: 
\"kubernetes.io/projected/963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18-kube-api-access-g66d2\") pod \"node-ca-msb4t\" (UID: \"963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18\") " pod="openshift-image-registry/node-ca-msb4t" Apr 17 20:15:31.856863 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.856840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0f04f579-459c-4246-9509-08688b2cebb7-etc-tuned\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.857070 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.857056 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6jn\" (UniqueName: \"kubernetes.io/projected/e38306e8-30e0-42b4-889f-6950beb72b21-kube-api-access-zz6jn\") pod \"aws-ebs-csi-driver-node-gvn9s\" (UID: \"e38306e8-30e0-42b4-889f-6950beb72b21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" Apr 17 20:15:31.857772 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.857735 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz6t6\" (UniqueName: \"kubernetes.io/projected/0f04f579-459c-4246-9509-08688b2cebb7-kube-api-access-bz6t6\") pod \"tuned-r4bks\" (UID: \"0f04f579-459c-4246-9509-08688b2cebb7\") " pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:31.857854 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.857829 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hpjs\" (UniqueName: \"kubernetes.io/projected/80b2534b-c049-4a65-8cdb-fc90c54d1a82-kube-api-access-6hpjs\") pod \"multus-r82dk\" (UID: \"80b2534b-c049-4a65-8cdb-fc90c54d1a82\") " pod="openshift-multus/multus-r82dk" Apr 17 20:15:31.858025 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.858009 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npktn\" 
(UniqueName: \"kubernetes.io/projected/b810562f-1e78-430b-bb52-4ddd48b17312-kube-api-access-npktn\") pod \"multus-additional-cni-plugins-p25dr\" (UID: \"b810562f-1e78-430b-bb52-4ddd48b17312\") " pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:31.861231 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.861213 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms7cb\" (UniqueName: \"kubernetes.io/projected/804b57c1-49b2-4e56-8da1-70a591e070e2-kube-api-access-ms7cb\") pod \"node-resolver-w9tk6\" (UID: \"804b57c1-49b2-4e56-8da1-70a591e070e2\") " pod="openshift-dns/node-resolver-w9tk6" Apr 17 20:15:31.876548 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.876502 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-2.ec2.internal" event={"ID":"2a95c6233e62f070e1ff4cac4a1fc713","Type":"ContainerStarted","Data":"f802c61a75a61db86f4df98232a32a97a274134eac253e22114000e20a8243f2"} Apr 17 20:15:31.877459 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.877436 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" event={"ID":"a71794485a15783c46c26c3b9e1289ac","Type":"ContainerStarted","Data":"e00ab0affaa74a4334487e6f293b5ea5d1c86f4311680ce5d4d2305e06ee27b7"} Apr 17 20:15:31.953128 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953039 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:31.953128 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953072 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-node-log\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.953128 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953091 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.953128 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953111 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5k5hj\" (UniqueName: \"kubernetes.io/projected/6c2533fa-a070-45de-b5b1-0a51d7a87b22-kube-api-access-5k5hj\") pod \"iptables-alerter-8j7zn\" (UID: \"6c2533fa-a070-45de-b5b1-0a51d7a87b22\") " pod="openshift-network-operator/iptables-alerter-8j7zn" Apr 17 20:15:31.953128 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953129 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ftcw\" (UniqueName: \"kubernetes.io/projected/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-kube-api-access-5ftcw\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953148 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-run-systemd\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953172 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/200427fe-0d95-4f14-9c75-fa998acab9e6-ovn-node-metrics-cert\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953177 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-node-log\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953194 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmtvd\" (UniqueName: \"kubernetes.io/projected/200427fe-0d95-4f14-9c75-fa998acab9e6-kube-api-access-kmtvd\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953178 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:31.953219 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953265 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/6c2533fa-a070-45de-b5b1-0a51d7a87b22-iptables-alerter-script\") pod \"iptables-alerter-8j7zn\" (UID: \"6c2533fa-a070-45de-b5b1-0a51d7a87b22\") " pod="openshift-network-operator/iptables-alerter-8j7zn" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:31.953311 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs podName:fcb80713-90b2-4ae8-95b5-a07c24ab45e2 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:32.453274835 +0000 UTC m=+2.118200522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs") pod "network-metrics-daemon-842wl" (UID: "fcb80713-90b2-4ae8-95b5-a07c24ab45e2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953343 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-run-netns\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953376 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-slash\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953379 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-run-systemd\") pod \"ovnkube-node-46w54\" 
(UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-run-netns\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953415 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6c2533fa-a070-45de-b5b1-0a51d7a87b22-host-slash\") pod \"iptables-alerter-8j7zn\" (UID: \"6c2533fa-a070-45de-b5b1-0a51d7a87b22\") " pod="openshift-network-operator/iptables-alerter-8j7zn" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953445 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6c2533fa-a070-45de-b5b1-0a51d7a87b22-host-slash\") pod \"iptables-alerter-8j7zn\" (UID: \"6c2533fa-a070-45de-b5b1-0a51d7a87b22\") " pod="openshift-network-operator/iptables-alerter-8j7zn" Apr 17 20:15:31.953454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953446 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-log-socket\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-slash\") pod \"ovnkube-node-46w54\" (UID: 
\"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953475 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-log-socket\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953476 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/200427fe-0d95-4f14-9c75-fa998acab9e6-ovnkube-config\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953556 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-var-lib-openvswitch\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953580 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953606 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-var-lib-openvswitch\") pod 
\"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953608 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-kubelet\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953644 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-etc-openvswitch\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953660 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-kubelet\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953676 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-etc-openvswitch\") pod 
\"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953680 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-systemd-units\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953705 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-run-openvswitch\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/200427fe-0d95-4f14-9c75-fa998acab9e6-ovnkube-script-lib\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953762 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-run-ovn\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953784 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-cni-bin\") pod 
\"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953797 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-run-openvswitch\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953801 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6c2533fa-a070-45de-b5b1-0a51d7a87b22-iptables-alerter-script\") pod \"iptables-alerter-8j7zn\" (UID: \"6c2533fa-a070-45de-b5b1-0a51d7a87b22\") " pod="openshift-network-operator/iptables-alerter-8j7zn" Apr 17 20:15:31.954523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953828 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-systemd-units\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953809 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bczm\" (UniqueName: \"kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm\") pod \"network-check-target-q6spr\" (UID: \"e114821c-4bf6-4656-8172-0f7ba8948fdc\") " pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:31.954523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953868 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/200427fe-0d95-4f14-9c75-fa998acab9e6-env-overrides\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953873 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-cni-bin\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953899 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-cni-netd\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.953957 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-host-cni-netd\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.954008 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/200427fe-0d95-4f14-9c75-fa998acab9e6-ovnkube-config\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.954068 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/200427fe-0d95-4f14-9c75-fa998acab9e6-run-ovn\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.954248 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/200427fe-0d95-4f14-9c75-fa998acab9e6-env-overrides\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.954523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.954266 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/200427fe-0d95-4f14-9c75-fa998acab9e6-ovnkube-script-lib\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.955631 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.955616 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/200427fe-0d95-4f14-9c75-fa998acab9e6-ovn-node-metrics-cert\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.959457 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:31.959424 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:31.959457 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:31.959455 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:31.959608 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:31.959470 2579 
projected.go:194] Error preparing data for projected volume kube-api-access-2bczm for pod openshift-network-diagnostics/network-check-target-q6spr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:31.959608 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:31.959541 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm podName:e114821c-4bf6-4656-8172-0f7ba8948fdc nodeName:}" failed. No retries permitted until 2026-04-17 20:15:32.459521045 +0000 UTC m=+2.124446732 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2bczm" (UniqueName: "kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm") pod "network-check-target-q6spr" (UID: "e114821c-4bf6-4656-8172-0f7ba8948fdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:31.961784 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.961762 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmtvd\" (UniqueName: \"kubernetes.io/projected/200427fe-0d95-4f14-9c75-fa998acab9e6-kube-api-access-kmtvd\") pod \"ovnkube-node-46w54\" (UID: \"200427fe-0d95-4f14-9c75-fa998acab9e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:31.961911 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:31.961893 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k5hj\" (UniqueName: \"kubernetes.io/projected/6c2533fa-a070-45de-b5b1-0a51d7a87b22-kube-api-access-5k5hj\") pod \"iptables-alerter-8j7zn\" (UID: \"6c2533fa-a070-45de-b5b1-0a51d7a87b22\") " pod="openshift-network-operator/iptables-alerter-8j7zn" Apr 17 20:15:31.962215 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:15:31.962199 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ftcw\" (UniqueName: \"kubernetes.io/projected/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-kube-api-access-5ftcw\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:32.049984 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.049947 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r4bks" Apr 17 20:15:32.056259 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:32.056231 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f04f579_459c_4246_9509_08688b2cebb7.slice/crio-1af6e2a06ef81c4003ae48ac7592ede95b10dc0fcd7863d7dd1702bd210d8b4d WatchSource:0}: Error finding container 1af6e2a06ef81c4003ae48ac7592ede95b10dc0fcd7863d7dd1702bd210d8b4d: Status 404 returned error can't find the container with id 1af6e2a06ef81c4003ae48ac7592ede95b10dc0fcd7863d7dd1702bd210d8b4d Apr 17 20:15:32.059555 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.059534 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" Apr 17 20:15:32.066395 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:32.066375 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode38306e8_30e0_42b4_889f_6950beb72b21.slice/crio-87fd80fddc319282f203ebffe8466ebff9094ddd3b7fde89e5724aa331d27d08 WatchSource:0}: Error finding container 87fd80fddc319282f203ebffe8466ebff9094ddd3b7fde89e5724aa331d27d08: Status 404 returned error can't find the container with id 87fd80fddc319282f203ebffe8466ebff9094ddd3b7fde89e5724aa331d27d08 Apr 17 20:15:32.077091 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.077065 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w9tk6" Apr 17 20:15:32.083016 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:32.082988 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804b57c1_49b2_4e56_8da1_70a591e070e2.slice/crio-81fd5143e590d244c50fb4ee78b0b3c89a5fbee8bc679119d7a80b7cafa04a64 WatchSource:0}: Error finding container 81fd5143e590d244c50fb4ee78b0b3c89a5fbee8bc679119d7a80b7cafa04a64: Status 404 returned error can't find the container with id 81fd5143e590d244c50fb4ee78b0b3c89a5fbee8bc679119d7a80b7cafa04a64 Apr 17 20:15:32.092500 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.092479 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-msb4t" Apr 17 20:15:32.099283 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:32.099259 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod963f9ba4_1fc0_4858_aa10_5a4f1aaf9c18.slice/crio-6c35d1b48182295acdf1605fa6cddbd43a1d71bd791a94698342ebc7378874cb WatchSource:0}: Error finding container 6c35d1b48182295acdf1605fa6cddbd43a1d71bd791a94698342ebc7378874cb: Status 404 returned error can't find the container with id 6c35d1b48182295acdf1605fa6cddbd43a1d71bd791a94698342ebc7378874cb Apr 17 20:15:32.108020 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.107998 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mcn8b" Apr 17 20:15:32.113968 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:32.113930 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod955d42b9_2b82_4faa_aa56_05c806e38889.slice/crio-8d4dfd648e6ceec5403feab4ff12d96f63798f64209545def72524ffd8572b11 WatchSource:0}: Error finding container 8d4dfd648e6ceec5403feab4ff12d96f63798f64209545def72524ffd8572b11: Status 404 returned error can't find the container with id 8d4dfd648e6ceec5403feab4ff12d96f63798f64209545def72524ffd8572b11 Apr 17 20:15:32.126538 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.126505 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p25dr" Apr 17 20:15:32.133112 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:32.133087 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb810562f_1e78_430b_bb52_4ddd48b17312.slice/crio-5ebc848c34fce8ea7459e0f727c3deba2baec5c4d2fb42d0fd31dc505cde8ef6 WatchSource:0}: Error finding container 5ebc848c34fce8ea7459e0f727c3deba2baec5c4d2fb42d0fd31dc505cde8ef6: Status 404 returned error can't find the container with id 5ebc848c34fce8ea7459e0f727c3deba2baec5c4d2fb42d0fd31dc505cde8ef6 Apr 17 20:15:32.134699 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.134681 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-r82dk" Apr 17 20:15:32.140648 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:32.140628 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80b2534b_c049_4a65_8cdb_fc90c54d1a82.slice/crio-466112346bc2dab990b41085da602505d856effb6066c01cac5012d6f8dd05ac WatchSource:0}: Error finding container 466112346bc2dab990b41085da602505d856effb6066c01cac5012d6f8dd05ac: Status 404 returned error can't find the container with id 466112346bc2dab990b41085da602505d856effb6066c01cac5012d6f8dd05ac Apr 17 20:15:32.145108 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.145081 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8j7zn" Apr 17 20:15:32.151090 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.151071 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-46w54" Apr 17 20:15:32.151171 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:32.151158 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c2533fa_a070_45de_b5b1_0a51d7a87b22.slice/crio-fa58bacf12d14babca24d2a1408e523008b0cc98150111431dc5f198a3013b48 WatchSource:0}: Error finding container fa58bacf12d14babca24d2a1408e523008b0cc98150111431dc5f198a3013b48: Status 404 returned error can't find the container with id fa58bacf12d14babca24d2a1408e523008b0cc98150111431dc5f198a3013b48 Apr 17 20:15:32.160066 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:15:32.160028 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod200427fe_0d95_4f14_9c75_fa998acab9e6.slice/crio-3034c0309ebbbdd777bf23453fc3754ef201ebc47080ecf9821528dc21258d50 WatchSource:0}: Error finding container 3034c0309ebbbdd777bf23453fc3754ef201ebc47080ecf9821528dc21258d50: Status 404 returned error can't find the container with id 3034c0309ebbbdd777bf23453fc3754ef201ebc47080ecf9821528dc21258d50 Apr 17 20:15:32.457804 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.457772 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:32.457990 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:32.457889 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:32.457990 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:32.457948 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs podName:fcb80713-90b2-4ae8-95b5-a07c24ab45e2 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:33.4579292 +0000 UTC m=+3.122854868 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs") pod "network-metrics-daemon-842wl" (UID: "fcb80713-90b2-4ae8-95b5-a07c24ab45e2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:32.558713 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.558672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bczm\" (UniqueName: \"kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm\") pod \"network-check-target-q6spr\" (UID: \"e114821c-4bf6-4656-8172-0f7ba8948fdc\") " pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:32.558909 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:32.558886 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:32.558909 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:32.558906 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:32.559022 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:32.558919 2579 projected.go:194] Error preparing data for projected volume kube-api-access-2bczm for pod openshift-network-diagnostics/network-check-target-q6spr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:32.559022 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:32.558972 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm podName:e114821c-4bf6-4656-8172-0f7ba8948fdc nodeName:}" failed. No retries permitted until 2026-04-17 20:15:33.558955618 +0000 UTC m=+3.223881292 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2bczm" (UniqueName: "kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm") pod "network-check-target-q6spr" (UID: "e114821c-4bf6-4656-8172-0f7ba8948fdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:32.560038 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.560018 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:32.774237 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.774190 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:10:31 +0000 UTC" deadline="2028-01-24 07:19:58.34988875 +0000 UTC" Apr 17 20:15:32.774237 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.774234 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15515h4m25.575658518s" Apr 17 20:15:32.894476 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.892836 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46w54" event={"ID":"200427fe-0d95-4f14-9c75-fa998acab9e6","Type":"ContainerStarted","Data":"3034c0309ebbbdd777bf23453fc3754ef201ebc47080ecf9821528dc21258d50"} Apr 17 20:15:32.898229 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.898191 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r82dk" 
event={"ID":"80b2534b-c049-4a65-8cdb-fc90c54d1a82","Type":"ContainerStarted","Data":"466112346bc2dab990b41085da602505d856effb6066c01cac5012d6f8dd05ac"} Apr 17 20:15:32.905259 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.905219 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mcn8b" event={"ID":"955d42b9-2b82-4faa-aa56-05c806e38889","Type":"ContainerStarted","Data":"8d4dfd648e6ceec5403feab4ff12d96f63798f64209545def72524ffd8572b11"} Apr 17 20:15:32.911255 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.911203 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:32.916070 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.916036 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-msb4t" event={"ID":"963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18","Type":"ContainerStarted","Data":"6c35d1b48182295acdf1605fa6cddbd43a1d71bd791a94698342ebc7378874cb"} Apr 17 20:15:32.928795 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.928725 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" event={"ID":"e38306e8-30e0-42b4-889f-6950beb72b21","Type":"ContainerStarted","Data":"87fd80fddc319282f203ebffe8466ebff9094ddd3b7fde89e5724aa331d27d08"} Apr 17 20:15:32.953005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.952966 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8j7zn" event={"ID":"6c2533fa-a070-45de-b5b1-0a51d7a87b22","Type":"ContainerStarted","Data":"fa58bacf12d14babca24d2a1408e523008b0cc98150111431dc5f198a3013b48"} Apr 17 20:15:32.959145 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.959090 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p25dr" 
event={"ID":"b810562f-1e78-430b-bb52-4ddd48b17312","Type":"ContainerStarted","Data":"5ebc848c34fce8ea7459e0f727c3deba2baec5c4d2fb42d0fd31dc505cde8ef6"} Apr 17 20:15:32.975613 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.975528 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w9tk6" event={"ID":"804b57c1-49b2-4e56-8da1-70a591e070e2","Type":"ContainerStarted","Data":"81fd5143e590d244c50fb4ee78b0b3c89a5fbee8bc679119d7a80b7cafa04a64"} Apr 17 20:15:32.979651 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.979593 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r4bks" event={"ID":"0f04f579-459c-4246-9509-08688b2cebb7","Type":"ContainerStarted","Data":"1af6e2a06ef81c4003ae48ac7592ede95b10dc0fcd7863d7dd1702bd210d8b4d"} Apr 17 20:15:32.980177 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:32.980155 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:33.468419 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.468380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:33.468595 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:33.468568 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:33.468670 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:33.468634 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs podName:fcb80713-90b2-4ae8-95b5-a07c24ab45e2 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:15:35.468614862 +0000 UTC m=+5.133540538 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs") pod "network-metrics-daemon-842wl" (UID: "fcb80713-90b2-4ae8-95b5-a07c24ab45e2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:33.569819 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.569780 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bczm\" (UniqueName: \"kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm\") pod \"network-check-target-q6spr\" (UID: \"e114821c-4bf6-4656-8172-0f7ba8948fdc\") " pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:33.570011 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:33.569989 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:33.570102 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:33.570016 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:33.570102 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:33.570030 2579 projected.go:194] Error preparing data for projected volume kube-api-access-2bczm for pod openshift-network-diagnostics/network-check-target-q6spr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:33.570102 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:33.570098 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm 
podName:e114821c-4bf6-4656-8172-0f7ba8948fdc nodeName:}" failed. No retries permitted until 2026-04-17 20:15:35.57007903 +0000 UTC m=+5.235004721 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2bczm" (UniqueName: "kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm") pod "network-check-target-q6spr" (UID: "e114821c-4bf6-4656-8172-0f7ba8948fdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:33.717150 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.717113 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-smnrp"] Apr 17 20:15:33.720010 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.719949 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:33.720186 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:33.720046 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae" Apr 17 20:15:33.771109 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.771072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-kubelet-config\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:33.771274 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.771127 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:33.771274 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.771161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-dbus\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:33.774536 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.774494 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:10:31 +0000 UTC" deadline="2027-11-22 11:51:09.703510304 +0000 UTC" Apr 17 20:15:33.774536 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.774533 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14007h35m35.928981096s" Apr 17 20:15:33.872240 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.872204 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-kubelet-config\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:33.872240 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.872256 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:33.872518 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.872294 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-dbus\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:33.872518 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.872499 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-dbus\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:33.872626 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.872567 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-kubelet-config\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:33.872673 ip-10-0-139-2 kubenswrapper[2579]: 
E0417 20:15:33.872664 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:33.872725 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:33.872720 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret podName:52c55f78-79f4-41d2-8ed3-1f214a05f8ae nodeName:}" failed. No retries permitted until 2026-04-17 20:15:34.372701858 +0000 UTC m=+4.037627534 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret") pod "global-pull-secret-syncer-smnrp" (UID: "52c55f78-79f4-41d2-8ed3-1f214a05f8ae") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:33.873199 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.873175 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:33.873322 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:33.873298 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc" Apr 17 20:15:33.873422 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:33.873408 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:33.873515 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:33.873497 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2" Apr 17 20:15:34.377344 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:34.377306 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:34.377514 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:34.377468 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:34.377596 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:34.377532 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret podName:52c55f78-79f4-41d2-8ed3-1f214a05f8ae nodeName:}" failed. No retries permitted until 2026-04-17 20:15:35.377513572 +0000 UTC m=+5.042439253 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret") pod "global-pull-secret-syncer-smnrp" (UID: "52c55f78-79f4-41d2-8ed3-1f214a05f8ae") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:34.877135 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:34.876500 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:34.877135 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:34.876718 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae" Apr 17 20:15:35.383999 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:35.383953 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:35.384200 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:35.384169 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:35.384273 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:35.384221 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret podName:52c55f78-79f4-41d2-8ed3-1f214a05f8ae nodeName:}" failed. 
No retries permitted until 2026-04-17 20:15:37.384206956 +0000 UTC m=+7.049132624 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret") pod "global-pull-secret-syncer-smnrp" (UID: "52c55f78-79f4-41d2-8ed3-1f214a05f8ae") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:35.485247 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:35.485146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:35.485424 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:35.485333 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:35.485424 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:35.485416 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs podName:fcb80713-90b2-4ae8-95b5-a07c24ab45e2 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:39.485393727 +0000 UTC m=+9.150319404 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs") pod "network-metrics-daemon-842wl" (UID: "fcb80713-90b2-4ae8-95b5-a07c24ab45e2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:35.586595 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:35.585884 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bczm\" (UniqueName: \"kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm\") pod \"network-check-target-q6spr\" (UID: \"e114821c-4bf6-4656-8172-0f7ba8948fdc\") " pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:35.586595 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:35.586111 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:35.586595 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:35.586133 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:35.586595 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:35.586146 2579 projected.go:194] Error preparing data for projected volume kube-api-access-2bczm for pod openshift-network-diagnostics/network-check-target-q6spr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:35.586595 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:35.586207 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm podName:e114821c-4bf6-4656-8172-0f7ba8948fdc nodeName:}" failed. 
No retries permitted until 2026-04-17 20:15:39.586187765 +0000 UTC m=+9.251113447 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2bczm" (UniqueName: "kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm") pod "network-check-target-q6spr" (UID: "e114821c-4bf6-4656-8172-0f7ba8948fdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:35.874236 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:35.874157 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:35.874376 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:35.874296 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc" Apr 17 20:15:35.874411 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:35.874375 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:35.874484 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:35.874463 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2" Apr 17 20:15:36.876009 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:36.875971 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:36.876421 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:36.876102 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae" Apr 17 20:15:37.402681 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:37.402640 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:37.402879 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:37.402851 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:37.402949 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:37.402923 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret podName:52c55f78-79f4-41d2-8ed3-1f214a05f8ae nodeName:}" failed. No retries permitted until 2026-04-17 20:15:41.402903196 +0000 UTC m=+11.067828915 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret") pod "global-pull-secret-syncer-smnrp" (UID: "52c55f78-79f4-41d2-8ed3-1f214a05f8ae") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:37.874113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:37.874028 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:37.874263 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:37.874172 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc" Apr 17 20:15:37.874263 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:37.874176 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:37.874383 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:37.874278 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2" Apr 17 20:15:38.873398 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:38.873338 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:38.873906 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:38.873479 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae" Apr 17 20:15:39.521042 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:39.521008 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:39.521244 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:39.521191 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:39.521303 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:39.521256 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs podName:fcb80713-90b2-4ae8-95b5-a07c24ab45e2 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:47.521235448 +0000 UTC m=+17.186161130 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs") pod "network-metrics-daemon-842wl" (UID: "fcb80713-90b2-4ae8-95b5-a07c24ab45e2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:39.622360 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:39.622320 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bczm\" (UniqueName: \"kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm\") pod \"network-check-target-q6spr\" (UID: \"e114821c-4bf6-4656-8172-0f7ba8948fdc\") " pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:39.622553 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:39.622537 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:39.622595 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:39.622557 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:39.622595 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:39.622566 2579 projected.go:194] Error preparing data for projected volume kube-api-access-2bczm for pod openshift-network-diagnostics/network-check-target-q6spr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:39.622691 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:39.622617 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm podName:e114821c-4bf6-4656-8172-0f7ba8948fdc nodeName:}" failed. 
No retries permitted until 2026-04-17 20:15:47.622598809 +0000 UTC m=+17.287524478 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2bczm" (UniqueName: "kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm") pod "network-check-target-q6spr" (UID: "e114821c-4bf6-4656-8172-0f7ba8948fdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:39.874081 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:39.873991 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:39.874498 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:39.873992 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:39.874498 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:39.874302 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc" Apr 17 20:15:39.874498 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:39.874313 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2" Apr 17 20:15:40.874505 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:40.874470 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:40.874987 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:40.874571 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae" Apr 17 20:15:41.436120 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:41.436079 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:41.436306 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:41.436238 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:41.436382 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:41.436313 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret podName:52c55f78-79f4-41d2-8ed3-1f214a05f8ae nodeName:}" failed. No retries permitted until 2026-04-17 20:15:49.436290743 +0000 UTC m=+19.101216424 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret") pod "global-pull-secret-syncer-smnrp" (UID: "52c55f78-79f4-41d2-8ed3-1f214a05f8ae") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:41.874155 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:41.874069 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:41.874321 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:41.874075 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:41.874321 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:41.874194 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc" Apr 17 20:15:41.874413 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:41.874314 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2" Apr 17 20:15:42.873681 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:42.873646 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:42.874152 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:42.873786 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae" Apr 17 20:15:43.874001 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:43.873964 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:43.874471 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:43.873964 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:43.874471 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:43.874094 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc" Apr 17 20:15:43.874471 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:43.874189 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2" Apr 17 20:15:44.874292 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:44.874251 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:44.874771 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:44.874388 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae" Apr 17 20:15:45.873422 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:45.873383 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:45.873589 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:45.873490 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc" Apr 17 20:15:45.873589 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:45.873552 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:45.873725 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:45.873677 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2" Apr 17 20:15:46.874197 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:46.874153 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:15:46.874599 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:46.874295 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae" Apr 17 20:15:47.579430 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:47.579389 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:47.579653 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:47.579518 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:47.579653 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:47.579590 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs podName:fcb80713-90b2-4ae8-95b5-a07c24ab45e2 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:03.579571212 +0000 UTC m=+33.244496893 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs") pod "network-metrics-daemon-842wl" (UID: "fcb80713-90b2-4ae8-95b5-a07c24ab45e2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:47.680463 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:47.680425 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bczm\" (UniqueName: \"kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm\") pod \"network-check-target-q6spr\" (UID: \"e114821c-4bf6-4656-8172-0f7ba8948fdc\") " pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:47.680649 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:47.680580 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:47.680649 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:47.680601 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:47.680649 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:47.680614 2579 projected.go:194] Error preparing data for projected volume kube-api-access-2bczm for pod openshift-network-diagnostics/network-check-target-q6spr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:47.680805 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:47.680681 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm podName:e114821c-4bf6-4656-8172-0f7ba8948fdc nodeName:}" failed. 
No retries permitted until 2026-04-17 20:16:03.68066282 +0000 UTC m=+33.345588495 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2bczm" (UniqueName: "kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm") pod "network-check-target-q6spr" (UID: "e114821c-4bf6-4656-8172-0f7ba8948fdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:47.873271 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:47.873184 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:15:47.873411 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:47.873312 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc" Apr 17 20:15:47.873411 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:47.873366 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:15:47.873509 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:47.873489 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2"
Apr 17 20:15:48.873704 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:48.873659 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp"
Apr 17 20:15:48.874232 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:48.873823 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae"
Apr 17 20:15:49.495381 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:49.495331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp"
Apr 17 20:15:49.495554 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:49.495533 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:15:49.495629 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:49.495617 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret podName:52c55f78-79f4-41d2-8ed3-1f214a05f8ae nodeName:}" failed. No retries permitted until 2026-04-17 20:16:05.49559786 +0000 UTC m=+35.160523532 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret") pod "global-pull-secret-syncer-smnrp" (UID: "52c55f78-79f4-41d2-8ed3-1f214a05f8ae") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:15:49.873808 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:49.873722 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:15:49.873808 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:49.873766 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl"
Apr 17 20:15:49.874242 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:49.873858 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc"
Apr 17 20:15:49.874242 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:49.874023 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2"
Apr 17 20:15:50.875322 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:50.875016 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp"
Apr 17 20:15:50.875780 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:50.875361 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae"
Apr 17 20:15:51.016776 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:51.016724 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r4bks" event={"ID":"0f04f579-459c-4246-9509-08688b2cebb7","Type":"ContainerStarted","Data":"b457010fba0ea79b34d196532956e5cdae36dfa6031c04b0f6aefdf7426b3044"}
Apr 17 20:15:51.018166 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:51.018137 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r82dk" event={"ID":"80b2534b-c049-4a65-8cdb-fc90c54d1a82","Type":"ContainerStarted","Data":"7ba62c26b10decd1e26b29a16909cbca47b22a8e0d91781cf2cef292ced7f66e"}
Apr 17 20:15:51.019289 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:51.019268 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-2.ec2.internal" event={"ID":"2a95c6233e62f070e1ff4cac4a1fc713","Type":"ContainerStarted","Data":"1419302be1fe1a6392ad1612b7e22d65675654e6b2f5037355444b9c92ac0957"}
Apr 17 20:15:51.032536 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:51.032487 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-r4bks" podStartSLOduration=2.770340039 podStartE2EDuration="21.032474372s" podCreationTimestamp="2026-04-17 20:15:30 +0000 UTC" firstStartedPulling="2026-04-17 20:15:32.058201387 +0000 UTC m=+1.723127056"
lastFinishedPulling="2026-04-17 20:15:50.320335718 +0000 UTC m=+19.985261389" observedRunningTime="2026-04-17 20:15:51.032122274 +0000 UTC m=+20.697047963" watchObservedRunningTime="2026-04-17 20:15:51.032474372 +0000 UTC m=+20.697400059"
Apr 17 20:15:51.047008 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:51.046964 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-r82dk" podStartSLOduration=2.638463391 podStartE2EDuration="21.046950552s" podCreationTimestamp="2026-04-17 20:15:30 +0000 UTC" firstStartedPulling="2026-04-17 20:15:32.142118413 +0000 UTC m=+1.807044081" lastFinishedPulling="2026-04-17 20:15:50.550605569 +0000 UTC m=+20.215531242" observedRunningTime="2026-04-17 20:15:51.046460807 +0000 UTC m=+20.711386498" watchObservedRunningTime="2026-04-17 20:15:51.046950552 +0000 UTC m=+20.711876242"
Apr 17 20:15:51.059467 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:51.059415 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-2.ec2.internal" podStartSLOduration=20.059393964 podStartE2EDuration="20.059393964s" podCreationTimestamp="2026-04-17 20:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:15:51.058371598 +0000 UTC m=+20.723297288" watchObservedRunningTime="2026-04-17 20:15:51.059393964 +0000 UTC m=+20.724319652"
Apr 17 20:15:51.873811 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:51.873603 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl"
Apr 17 20:15:51.874004 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:51.873603 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:15:51.874004 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:51.873911 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2"
Apr 17 20:15:51.874004 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:51.873941 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc"
Apr 17 20:15:52.022535 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.022495 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w9tk6" event={"ID":"804b57c1-49b2-4e56-8da1-70a591e070e2","Type":"ContainerStarted","Data":"244d195bac722b4a768a563e18962e44850902ba7a93a2dc5c1161d6098c0123"}
Apr 17 20:15:52.024271 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.024242 2579 generic.go:358] "Generic (PLEG): container finished" podID="a71794485a15783c46c26c3b9e1289ac" containerID="7a9fdbf3e06f31a371c85e503833afc8e91c91159f57b041879ada2cfb7391fd" exitCode=0
Apr 17 20:15:52.024396 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.024341 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" event={"ID":"a71794485a15783c46c26c3b9e1289ac","Type":"ContainerDied","Data":"7a9fdbf3e06f31a371c85e503833afc8e91c91159f57b041879ada2cfb7391fd"}
Apr 17 20:15:52.027368 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.027343 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46w54" event={"ID":"200427fe-0d95-4f14-9c75-fa998acab9e6","Type":"ContainerStarted","Data":"63e7df00d66dfd606d33cd9f034c8eff9add1cf874d6591e973b7599bb3882a6"}
Apr 17 20:15:52.027472 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.027376 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46w54" event={"ID":"200427fe-0d95-4f14-9c75-fa998acab9e6","Type":"ContainerStarted","Data":"8dd6b2b29626f6dd72d1e58f8d91a4f06dafc95c288f54323e3199634514ded6"}
Apr 17 20:15:52.027472 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.027391 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46w54" event={"ID":"200427fe-0d95-4f14-9c75-fa998acab9e6","Type":"ContainerStarted","Data":"64be3068b18d28af9659440603ddd3de3f0db8510ed335fcbd5d7bdc5354c6f2"}
Apr 17 20:15:52.027472 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.027404 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46w54" event={"ID":"200427fe-0d95-4f14-9c75-fa998acab9e6","Type":"ContainerStarted","Data":"e29a19bb7bf3d1b8b6fd8148ace966fc6097890d4c49c88ff2148ad974194caf"}
Apr 17 20:15:52.027472 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.027416 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46w54" event={"ID":"200427fe-0d95-4f14-9c75-fa998acab9e6","Type":"ContainerStarted","Data":"5eff821f7a48695d23804152aa83564772cd5cfbe63b2e4710198a16abb07c64"}
Apr 17 20:15:52.027472 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.027427 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-ovn-kubernetes/ovnkube-node-46w54" event={"ID":"200427fe-0d95-4f14-9c75-fa998acab9e6","Type":"ContainerStarted","Data":"497856f2831275ebe32ec02647a9b7765e0f9a7fd286b07e79c5cb7e55a785b2"}
Apr 17 20:15:52.028764 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.028705 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mcn8b" event={"ID":"955d42b9-2b82-4faa-aa56-05c806e38889","Type":"ContainerStarted","Data":"a5956f4818aada30adb7a2b89a509bc00352aac1a727a6877de4870482cd4a12"}
Apr 17 20:15:52.030052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.030029 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-msb4t" event={"ID":"963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18","Type":"ContainerStarted","Data":"5c57153b88341fe6079b8bc99002d9a628c82f84daf8e79423b33f86f06c3c29"}
Apr 17 20:15:52.031392 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.031370 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" event={"ID":"e38306e8-30e0-42b4-889f-6950beb72b21","Type":"ContainerStarted","Data":"63a49e8b3400d10eecfdf52ef80994e105e2ef4053387887c6e1282a88b5bfba"}
Apr 17 20:15:52.033496 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.032859 2579 generic.go:358] "Generic (PLEG): container finished" podID="b810562f-1e78-430b-bb52-4ddd48b17312" containerID="446de0d7b5be86083bba55dfa2022b2e25c7f2d3042d3095920d29d7123eba0f" exitCode=0
Apr 17 20:15:52.033496 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.033140 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p25dr" event={"ID":"b810562f-1e78-430b-bb52-4ddd48b17312","Type":"ContainerDied","Data":"446de0d7b5be86083bba55dfa2022b2e25c7f2d3042d3095920d29d7123eba0f"}
Apr 17 20:15:52.039074 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.039005 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w9tk6" podStartSLOduration=3.82913149 podStartE2EDuration="22.038989819s" podCreationTimestamp="2026-04-17 20:15:30 +0000 UTC" firstStartedPulling="2026-04-17 20:15:32.084468929 +0000 UTC m=+1.749394596" lastFinishedPulling="2026-04-17 20:15:50.294327252 +0000 UTC m=+19.959252925" observedRunningTime="2026-04-17 20:15:52.038886299 +0000 UTC m=+21.703811990" watchObservedRunningTime="2026-04-17 20:15:52.038989819 +0000 UTC m=+21.703915501"
Apr 17 20:15:52.056095 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.056046 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mcn8b" podStartSLOduration=3.87715165 podStartE2EDuration="22.056027846s" podCreationTimestamp="2026-04-17 20:15:30 +0000 UTC" firstStartedPulling="2026-04-17 20:15:32.115438128 +0000 UTC m=+1.780363798" lastFinishedPulling="2026-04-17 20:15:50.294314313 +0000 UTC m=+19.959239994" observedRunningTime="2026-04-17 20:15:52.055590941 +0000 UTC m=+21.720516633" watchObservedRunningTime="2026-04-17 20:15:52.056027846 +0000 UTC m=+21.720953537"
Apr 17 20:15:52.083179 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.083118 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-msb4t" podStartSLOduration=11.99774321 podStartE2EDuration="22.083099197s" podCreationTimestamp="2026-04-17 20:15:30 +0000 UTC" firstStartedPulling="2026-04-17 20:15:32.100797281 +0000 UTC m=+1.765722951" lastFinishedPulling="2026-04-17 20:15:42.186153265 +0000 UTC m=+11.851078938" observedRunningTime="2026-04-17 20:15:52.082684231 +0000 UTC m=+21.747609921" watchObservedRunningTime="2026-04-17 20:15:52.083099197 +0000 UTC m=+21.748024890"
Apr 17 20:15:52.500078 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.500051 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 20:15:52.809091 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.808929 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T20:15:52.500070215Z","UUID":"3f65f636-f019-4ac6-b41c-ca3bab49b548","Handler":null,"Name":"","Endpoint":""}
Apr 17 20:15:52.812357 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.812334 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 20:15:52.812357 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.812362 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 20:15:52.873685 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:52.873649 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp"
Apr 17 20:15:52.873887 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:52.873804 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae"
Apr 17 20:15:53.036490 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:53.036447 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" event={"ID":"e38306e8-30e0-42b4-889f-6950beb72b21","Type":"ContainerStarted","Data":"3de29ff3599f0ac706abfb746d81f2f3c973bfa879e057c39d34ff4ec591f4a3"}
Apr 17 20:15:53.038776 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:53.038382 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8j7zn" event={"ID":"6c2533fa-a070-45de-b5b1-0a51d7a87b22","Type":"ContainerStarted","Data":"fd4f159d7205784577c71372d588bbfa4c07fe71bfade28bf59c9a8675bb60cf"}
Apr 17 20:15:53.040482 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:53.040454 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" event={"ID":"a71794485a15783c46c26c3b9e1289ac","Type":"ContainerStarted","Data":"d034dfc11457be1ba53c194f8356f13aebe1b0a4b1501e11d54ad1f8c012bd4a"}
Apr 17 20:15:53.052356 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:53.052313 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8j7zn" podStartSLOduration=4.885952642 podStartE2EDuration="23.052299409s" podCreationTimestamp="2026-04-17 20:15:30 +0000 UTC" firstStartedPulling="2026-04-17 20:15:32.154598473 +0000 UTC m=+1.819524141" lastFinishedPulling="2026-04-17 20:15:50.320945239 +0000 UTC m=+19.985870908" observedRunningTime="2026-04-17 20:15:53.051951916 +0000 UTC m=+22.716877606" watchObservedRunningTime="2026-04-17 20:15:53.052299409 +0000 UTC m=+22.717225098"
Apr 17 20:15:53.066425 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:53.066327 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-2.ec2.internal" podStartSLOduration=22.066310587 podStartE2EDuration="22.066310587s" podCreationTimestamp="2026-04-17 20:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:15:53.065926017 +0000 UTC m=+22.730851706" watchObservedRunningTime="2026-04-17 20:15:53.066310587 +0000 UTC m=+22.731236277"
Apr 17 20:15:53.874065 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:53.873846 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:15:53.874316 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:53.873903 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl"
Apr 17 20:15:53.874316 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:53.874165 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc"
Apr 17 20:15:53.874316 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:53.874241 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2"
Apr 17 20:15:53.895892 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:53.895854 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mcn8b"
Apr 17 20:15:53.896560 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:53.896540 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mcn8b"
Apr 17 20:15:54.045723 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:54.045686 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46w54" event={"ID":"200427fe-0d95-4f14-9c75-fa998acab9e6","Type":"ContainerStarted","Data":"250dfda35792588164c25b58e6e38b1cc419accf0c64ffa166a8065a5f06f253"}
Apr 17 20:15:54.047828 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:54.047800 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" event={"ID":"e38306e8-30e0-42b4-889f-6950beb72b21","Type":"ContainerStarted","Data":"e2b713b8e87371ef8c2e4883f26ef2fa1142c02e50d189be81086e8a1b629597"}
Apr 17 20:15:54.048507 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:54.048335 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mcn8b"
Apr 17 20:15:54.048507 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:54.048487 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mcn8b"
Apr 17 20:15:54.063523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:54.063467 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gvn9s" podStartSLOduration=2.860559674 podStartE2EDuration="24.06345297s" podCreationTimestamp="2026-04-17 20:15:30 +0000 UTC" firstStartedPulling="2026-04-17 20:15:32.068000248 +0000 UTC m=+1.732925919"
lastFinishedPulling="2026-04-17 20:15:53.270893537 +0000 UTC m=+22.935819215" observedRunningTime="2026-04-17 20:15:54.063043747 +0000 UTC m=+23.727969438" watchObservedRunningTime="2026-04-17 20:15:54.06345297 +0000 UTC m=+23.728378660"
Apr 17 20:15:54.874289 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:54.874262 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp"
Apr 17 20:15:54.874476 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:54.874390 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae"
Apr 17 20:15:55.874156 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:55.874122 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:15:55.874156 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:55.874122 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl"
Apr 17 20:15:55.875015 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:55.874252 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc"
Apr 17 20:15:55.875015 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:55.874385 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2"
Apr 17 20:15:56.873674 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:56.873488 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp"
Apr 17 20:15:56.873889 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:56.873739 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae"
Apr 17 20:15:57.055420 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:57.055379 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46w54" event={"ID":"200427fe-0d95-4f14-9c75-fa998acab9e6","Type":"ContainerStarted","Data":"c96bb2ed68d1be13d2847b9328e7af651a3730fec8b78503dae38cb9f9bae9ad"}
Apr 17 20:15:57.056081 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:57.055727 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:57.056081 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:57.055777 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:57.057651 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:57.057626 2579 generic.go:358] "Generic (PLEG): container finished" podID="b810562f-1e78-430b-bb52-4ddd48b17312" containerID="9f671ed15fe797c51cd8fb9e28996c380ca7ef83ef0d6c7a9b41c010ff79cda9" exitCode=0
Apr 17 20:15:57.057803 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:57.057665 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p25dr" event={"ID":"b810562f-1e78-430b-bb52-4ddd48b17312","Type":"ContainerDied","Data":"9f671ed15fe797c51cd8fb9e28996c380ca7ef83ef0d6c7a9b41c010ff79cda9"}
Apr 17 20:15:57.071842 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:57.071818 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:57.102718 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:57.102676 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-46w54" podStartSLOduration=7.298278897 podStartE2EDuration="26.102659715s" podCreationTimestamp="2026-04-17 20:15:31 +0000 UTC"
firstStartedPulling="2026-04-17 20:15:32.161880574 +0000 UTC m=+1.826806255" lastFinishedPulling="2026-04-17 20:15:50.966261405 +0000 UTC m=+20.631187073" observedRunningTime="2026-04-17 20:15:57.083301243 +0000 UTC m=+26.748226942" watchObservedRunningTime="2026-04-17 20:15:57.102659715 +0000 UTC m=+26.767585387"
Apr 17 20:15:57.873715 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:57.873683 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:15:57.873715 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:57.873706 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl"
Apr 17 20:15:57.873920 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:57.873824 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc"
Apr 17 20:15:57.873985 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:57.873961 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2"
Apr 17 20:15:58.061968 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:58.061719 2579 generic.go:358] "Generic (PLEG): container finished" podID="b810562f-1e78-430b-bb52-4ddd48b17312" containerID="780528f185a0744e3fcea190b9010a3ccd027884ca8fd4a57ec260018261af61" exitCode=0
Apr 17 20:15:58.061968 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:58.061779 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p25dr" event={"ID":"b810562f-1e78-430b-bb52-4ddd48b17312","Type":"ContainerDied","Data":"780528f185a0744e3fcea190b9010a3ccd027884ca8fd4a57ec260018261af61"}
Apr 17 20:15:58.062947 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:58.062465 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:58.078938 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:58.078912 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:15:58.167598 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:58.167565 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-842wl"]
Apr 17 20:15:58.167795 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:58.167714 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl"
Apr 17 20:15:58.167884 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:58.167856 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2"
Apr 17 20:15:58.168337 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:58.168320 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-smnrp"]
Apr 17 20:15:58.168420 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:58.168409 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp"
Apr 17 20:15:58.168499 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:58.168483 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae"
Apr 17 20:15:58.169501 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:58.169480 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-q6spr"]
Apr 17 20:15:58.169597 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:58.169581 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:15:58.169690 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:58.169673 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc"
Apr 17 20:15:59.065933 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:59.065851 2579 generic.go:358] "Generic (PLEG): container finished" podID="b810562f-1e78-430b-bb52-4ddd48b17312" containerID="d8fa24c44391b1d65f9554a3756db63787da79d21d639cb2102f2ca854d8aa11" exitCode=0
Apr 17 20:15:59.066284 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:59.065937 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p25dr" event={"ID":"b810562f-1e78-430b-bb52-4ddd48b17312","Type":"ContainerDied","Data":"d8fa24c44391b1d65f9554a3756db63787da79d21d639cb2102f2ca854d8aa11"}
Apr 17 20:15:59.873930 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:59.873894 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:15:59.873930 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:59.873936 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp"
Apr 17 20:15:59.874150 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:59.874020 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc"
Apr 17 20:15:59.874150 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:59.874046 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae"
Apr 17 20:15:59.874150 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:15:59.874090 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl"
Apr 17 20:15:59.874316 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:15:59.874167 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2"
Apr 17 20:16:01.873482 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:01.873436 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl"
Apr 17 20:16:01.874073 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:01.873545 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-842wl" podUID="fcb80713-90b2-4ae8-95b5-a07c24ab45e2"
Apr 17 20:16:01.874073 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:01.873552 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp"
Apr 17 20:16:01.874073 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:01.873575 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:16:01.874073 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:01.873639 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-smnrp" podUID="52c55f78-79f4-41d2-8ed3-1f214a05f8ae"
Apr 17 20:16:01.874073 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:01.873703 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q6spr" podUID="e114821c-4bf6-4656-8172-0f7ba8948fdc"
Apr 17 20:16:03.597200 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.597158 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl"
Apr 17 20:16:03.597715 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:03.597317 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:16:03.597715 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:03.597394 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs podName:fcb80713-90b2-4ae8-95b5-a07c24ab45e2 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:35.597373378 +0000 UTC m=+65.262299047 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs") pod "network-metrics-daemon-842wl" (UID: "fcb80713-90b2-4ae8-95b5-a07c24ab45e2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:16:03.631544 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.631516 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-2.ec2.internal" event="NodeReady"
Apr 17 20:16:03.631714 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.631661 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 20:16:03.674630 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.674592 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vkvbf"]
Apr 17 20:16:03.698323 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.698290 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bczm\" (UniqueName: \"kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm\") pod \"network-check-target-q6spr\" (UID: \"e114821c-4bf6-4656-8172-0f7ba8948fdc\") " pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:16:03.698501 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:03.698478 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:16:03.698555 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:03.698510 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:16:03.698555 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:03.698526 2579 projected.go:194] Error
preparing data for projected volume kube-api-access-2bczm for pod openshift-network-diagnostics/network-check-target-q6spr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:16:03.698648 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:03.698595 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm podName:e114821c-4bf6-4656-8172-0f7ba8948fdc nodeName:}" failed. No retries permitted until 2026-04-17 20:16:35.698573324 +0000 UTC m=+65.363499013 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-2bczm" (UniqueName: "kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm") pod "network-check-target-q6spr" (UID: "e114821c-4bf6-4656-8172-0f7ba8948fdc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:16:03.712172 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.712140 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zvwgs"] Apr 17 20:16:03.712352 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.712317 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:03.714865 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.714823 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 20:16:03.715222 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.715201 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 20:16:03.715327 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.715212 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-x9tzj\"" Apr 17 20:16:03.729932 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.729912 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vkvbf"] Apr 17 20:16:03.729932 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.729938 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zvwgs"] Apr 17 20:16:03.730094 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.730041 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:16:03.732224 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.732206 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ndtv9\"" Apr 17 20:16:03.732347 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.732235 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 20:16:03.732489 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.732463 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 20:16:03.732618 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.732602 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 20:16:03.799260 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.799220 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a083116d-3c26-492c-b99a-c51bbaa51aa4-tmp-dir\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:03.799260 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.799265 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp8mn\" (UniqueName: \"kubernetes.io/projected/a083116d-3c26-492c-b99a-c51bbaa51aa4-kube-api-access-xp8mn\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:03.799466 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.799312 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a083116d-3c26-492c-b99a-c51bbaa51aa4-config-volume\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:03.799466 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.799417 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p448\" (UniqueName: \"kubernetes.io/projected/159a3a3e-608e-405f-ac09-ff7186a9c710-kube-api-access-9p448\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:16:03.799466 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.799444 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:03.799574 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.799482 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:16:03.873638 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.873594 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:16:03.873867 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.873717 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr" Apr 17 20:16:03.874170 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.874153 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:16:03.876464 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.876441 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 20:16:03.876464 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.876458 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x8drw\"" Apr 17 20:16:03.876623 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.876512 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 20:16:03.876623 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.876464 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rfhz5\"" Apr 17 20:16:03.876759 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.876731 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:16:03.876873 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.876735 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:16:03.900505 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.900479 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9p448\" (UniqueName: \"kubernetes.io/projected/159a3a3e-608e-405f-ac09-ff7186a9c710-kube-api-access-9p448\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:16:03.900658 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.900516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:03.900658 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.900556 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:16:03.900658 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.900597 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a083116d-3c26-492c-b99a-c51bbaa51aa4-tmp-dir\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:03.900658 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.900617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xp8mn\" (UniqueName: \"kubernetes.io/projected/a083116d-3c26-492c-b99a-c51bbaa51aa4-kube-api-access-xp8mn\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:03.900866 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.900667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a083116d-3c26-492c-b99a-c51bbaa51aa4-config-volume\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:03.900866 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:03.900701 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:03.900866 ip-10-0-139-2 kubenswrapper[2579]: 
E0417 20:16:03.900777 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert podName:159a3a3e-608e-405f-ac09-ff7186a9c710 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:04.400738031 +0000 UTC m=+34.065663699 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert") pod "ingress-canary-zvwgs" (UID: "159a3a3e-608e-405f-ac09-ff7186a9c710") : secret "canary-serving-cert" not found Apr 17 20:16:03.901032 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.900926 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a083116d-3c26-492c-b99a-c51bbaa51aa4-tmp-dir\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:03.901032 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:03.900965 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:03.901032 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:03.901020 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls podName:a083116d-3c26-492c-b99a-c51bbaa51aa4 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:04.401000859 +0000 UTC m=+34.065926528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls") pod "dns-default-vkvbf" (UID: "a083116d-3c26-492c-b99a-c51bbaa51aa4") : secret "dns-default-metrics-tls" not found Apr 17 20:16:03.901224 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.901206 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a083116d-3c26-492c-b99a-c51bbaa51aa4-config-volume\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:03.911997 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.911831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p448\" (UniqueName: \"kubernetes.io/projected/159a3a3e-608e-405f-ac09-ff7186a9c710-kube-api-access-9p448\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:16:03.921043 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:03.921019 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp8mn\" (UniqueName: \"kubernetes.io/projected/a083116d-3c26-492c-b99a-c51bbaa51aa4-kube-api-access-xp8mn\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:04.405392 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:04.405343 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:16:04.405642 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:04.405474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:04.405642 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:04.405476 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:04.405642 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:04.405541 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:04.405642 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:04.405582 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert podName:159a3a3e-608e-405f-ac09-ff7186a9c710 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:05.405561445 +0000 UTC m=+35.070487115 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert") pod "ingress-canary-zvwgs" (UID: "159a3a3e-608e-405f-ac09-ff7186a9c710") : secret "canary-serving-cert" not found Apr 17 20:16:04.405642 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:04.405616 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls podName:a083116d-3c26-492c-b99a-c51bbaa51aa4 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:05.405597341 +0000 UTC m=+35.070523009 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls") pod "dns-default-vkvbf" (UID: "a083116d-3c26-492c-b99a-c51bbaa51aa4") : secret "dns-default-metrics-tls" not found Apr 17 20:16:05.414069 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:05.414030 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:05.414069 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:05.414082 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:16:05.414506 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:05.414185 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:05.414506 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:05.414206 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:05.414506 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:05.414251 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls podName:a083116d-3c26-492c-b99a-c51bbaa51aa4 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:07.414235041 +0000 UTC m=+37.079160710 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls") pod "dns-default-vkvbf" (UID: "a083116d-3c26-492c-b99a-c51bbaa51aa4") : secret "dns-default-metrics-tls" not found Apr 17 20:16:05.414506 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:05.414265 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert podName:159a3a3e-608e-405f-ac09-ff7186a9c710 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:07.414258704 +0000 UTC m=+37.079184371 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert") pod "ingress-canary-zvwgs" (UID: "159a3a3e-608e-405f-ac09-ff7186a9c710") : secret "canary-serving-cert" not found Apr 17 20:16:05.514549 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:05.514505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:16:05.527442 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:05.527413 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/52c55f78-79f4-41d2-8ed3-1f214a05f8ae-original-pull-secret\") pod \"global-pull-secret-syncer-smnrp\" (UID: \"52c55f78-79f4-41d2-8ed3-1f214a05f8ae\") " pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:16:05.686484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:05.686369 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-smnrp" Apr 17 20:16:05.828095 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:05.828067 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-smnrp"] Apr 17 20:16:05.833258 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:16:05.833230 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52c55f78_79f4_41d2_8ed3_1f214a05f8ae.slice/crio-8cde8b2899eab4c849cdbaf6af922cff93a8b83fae7810abb775e416f5f36597 WatchSource:0}: Error finding container 8cde8b2899eab4c849cdbaf6af922cff93a8b83fae7810abb775e416f5f36597: Status 404 returned error can't find the container with id 8cde8b2899eab4c849cdbaf6af922cff93a8b83fae7810abb775e416f5f36597 Apr 17 20:16:06.083271 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:06.083230 2579 generic.go:358] "Generic (PLEG): container finished" podID="b810562f-1e78-430b-bb52-4ddd48b17312" containerID="37149caff4c46bc444466f4da6276d649cb0e97d1e97f53ba4e6e4db46719044" exitCode=0 Apr 17 20:16:06.083425 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:06.083290 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p25dr" event={"ID":"b810562f-1e78-430b-bb52-4ddd48b17312","Type":"ContainerDied","Data":"37149caff4c46bc444466f4da6276d649cb0e97d1e97f53ba4e6e4db46719044"} Apr 17 20:16:06.084403 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:06.084382 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-smnrp" event={"ID":"52c55f78-79f4-41d2-8ed3-1f214a05f8ae","Type":"ContainerStarted","Data":"8cde8b2899eab4c849cdbaf6af922cff93a8b83fae7810abb775e416f5f36597"} Apr 17 20:16:07.090463 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:07.090180 2579 generic.go:358] "Generic (PLEG): container finished" podID="b810562f-1e78-430b-bb52-4ddd48b17312" 
containerID="ca2e69bae8d499de1f336e87ae4bca6ac7e7220bbedebeb39d4c7ebca3425219" exitCode=0 Apr 17 20:16:07.090899 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:07.090258 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p25dr" event={"ID":"b810562f-1e78-430b-bb52-4ddd48b17312","Type":"ContainerDied","Data":"ca2e69bae8d499de1f336e87ae4bca6ac7e7220bbedebeb39d4c7ebca3425219"} Apr 17 20:16:07.428035 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:07.427995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:16:07.428223 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:07.428056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:16:07.428223 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:07.428165 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:07.428332 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:07.428245 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls podName:a083116d-3c26-492c-b99a-c51bbaa51aa4 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:11.428219265 +0000 UTC m=+41.093144944 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls") pod "dns-default-vkvbf" (UID: "a083116d-3c26-492c-b99a-c51bbaa51aa4") : secret "dns-default-metrics-tls" not found Apr 17 20:16:07.428332 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:07.428167 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:07.428332 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:07.428319 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert podName:159a3a3e-608e-405f-ac09-ff7186a9c710 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:11.428300485 +0000 UTC m=+41.093226153 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert") pod "ingress-canary-zvwgs" (UID: "159a3a3e-608e-405f-ac09-ff7186a9c710") : secret "canary-serving-cert" not found Apr 17 20:16:08.096596 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:08.096511 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p25dr" event={"ID":"b810562f-1e78-430b-bb52-4ddd48b17312","Type":"ContainerStarted","Data":"2a3189080576ddeb7c67eb4831f0da865d65dce707015eec6293e22b3bb7734d"} Apr 17 20:16:08.118411 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:08.118352 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p25dr" podStartSLOduration=4.537887816 podStartE2EDuration="38.11829952s" podCreationTimestamp="2026-04-17 20:15:30 +0000 UTC" firstStartedPulling="2026-04-17 20:15:32.134565288 +0000 UTC m=+1.799490969" lastFinishedPulling="2026-04-17 20:16:05.714976985 +0000 UTC m=+35.379902673" observedRunningTime="2026-04-17 20:16:08.11700197 +0000 UTC m=+37.781927660" 
watchObservedRunningTime="2026-04-17 20:16:08.11829952 +0000 UTC m=+37.783225212"
Apr 17 20:16:11.106188 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:11.106148 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-smnrp" event={"ID":"52c55f78-79f4-41d2-8ed3-1f214a05f8ae","Type":"ContainerStarted","Data":"a49be4f33f0ad173ef9967cd18bf0c7be3872391098dde52822322ae6d348ca6"}
Apr 17 20:16:11.122915 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:11.122852 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-smnrp" podStartSLOduration=33.070116908 podStartE2EDuration="38.122832782s" podCreationTimestamp="2026-04-17 20:15:33 +0000 UTC" firstStartedPulling="2026-04-17 20:16:05.834933441 +0000 UTC m=+35.499859109" lastFinishedPulling="2026-04-17 20:16:10.887649315 +0000 UTC m=+40.552574983" observedRunningTime="2026-04-17 20:16:11.122300953 +0000 UTC m=+40.787226661" watchObservedRunningTime="2026-04-17 20:16:11.122832782 +0000 UTC m=+40.787758468"
Apr 17 20:16:11.459715 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:11.459666 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf"
Apr 17 20:16:11.459715 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:11.459723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs"
Apr 17 20:16:11.459965 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:11.459840 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:16:11.459965 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:11.459839 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:16:11.459965 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:11.459905 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert podName:159a3a3e-608e-405f-ac09-ff7186a9c710 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:19.459887488 +0000 UTC m=+49.124813155 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert") pod "ingress-canary-zvwgs" (UID: "159a3a3e-608e-405f-ac09-ff7186a9c710") : secret "canary-serving-cert" not found
Apr 17 20:16:11.459965 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:11.459923 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls podName:a083116d-3c26-492c-b99a-c51bbaa51aa4 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:19.4599146 +0000 UTC m=+49.124840268 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls") pod "dns-default-vkvbf" (UID: "a083116d-3c26-492c-b99a-c51bbaa51aa4") : secret "dns-default-metrics-tls" not found
Apr 17 20:16:19.526383 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:19.526340 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf"
Apr 17 20:16:19.526383 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:19.526390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs"
Apr 17 20:16:19.526945 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:19.526494 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:16:19.526945 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:19.526504 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:16:19.526945 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:19.526549 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert podName:159a3a3e-608e-405f-ac09-ff7186a9c710 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:35.526534409 +0000 UTC m=+65.191460078 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert") pod "ingress-canary-zvwgs" (UID: "159a3a3e-608e-405f-ac09-ff7186a9c710") : secret "canary-serving-cert" not found
Apr 17 20:16:19.526945 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:19.526573 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls podName:a083116d-3c26-492c-b99a-c51bbaa51aa4 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:35.52655464 +0000 UTC m=+65.191480313 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls") pod "dns-default-vkvbf" (UID: "a083116d-3c26-492c-b99a-c51bbaa51aa4") : secret "dns-default-metrics-tls" not found
Apr 17 20:16:30.079060 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:30.079031 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-46w54"
Apr 17 20:16:35.542424 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:35.542385 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf"
Apr 17 20:16:35.542424 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:35.542429 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs"
Apr 17 20:16:35.542946 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:35.542542 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:16:35.542946 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:35.542561 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:16:35.542946 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:35.542599 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert podName:159a3a3e-608e-405f-ac09-ff7186a9c710 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:07.542585275 +0000 UTC m=+97.207510942 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert") pod "ingress-canary-zvwgs" (UID: "159a3a3e-608e-405f-ac09-ff7186a9c710") : secret "canary-serving-cert" not found
Apr 17 20:16:35.542946 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:35.542638 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls podName:a083116d-3c26-492c-b99a-c51bbaa51aa4 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:07.542619674 +0000 UTC m=+97.207545356 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls") pod "dns-default-vkvbf" (UID: "a083116d-3c26-492c-b99a-c51bbaa51aa4") : secret "dns-default-metrics-tls" not found
Apr 17 20:16:35.643218 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:35.643184 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl"
Apr 17 20:16:35.645779 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:35.645740 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 20:16:35.653679 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:35.653661 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 20:16:35.653729 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:16:35.653718 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs podName:fcb80713-90b2-4ae8-95b5-a07c24ab45e2 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:39.653701936 +0000 UTC m=+129.318627604 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs") pod "network-metrics-daemon-842wl" (UID: "fcb80713-90b2-4ae8-95b5-a07c24ab45e2") : secret "metrics-daemon-secret" not found
Apr 17 20:16:35.744320 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:35.744279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bczm\" (UniqueName: \"kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm\") pod \"network-check-target-q6spr\" (UID: \"e114821c-4bf6-4656-8172-0f7ba8948fdc\") " pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:16:35.746608 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:35.746590 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 20:16:35.758331 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:35.758306 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 20:16:35.779201 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:35.779168 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bczm\" (UniqueName: \"kubernetes.io/projected/e114821c-4bf6-4656-8172-0f7ba8948fdc-kube-api-access-2bczm\") pod \"network-check-target-q6spr\" (UID: \"e114821c-4bf6-4656-8172-0f7ba8948fdc\") " pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:16:35.997023 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:35.996996 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x8drw\""
Apr 17 20:16:36.004779 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:36.004733 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:16:36.120450 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:36.120419 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-q6spr"]
Apr 17 20:16:36.124392 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:16:36.124349 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode114821c_4bf6_4656_8172_0f7ba8948fdc.slice/crio-f42d327d8a241cf8f95e44720cfd03c9b9566ad4bea71f2725275eb58670f7d6 WatchSource:0}: Error finding container f42d327d8a241cf8f95e44720cfd03c9b9566ad4bea71f2725275eb58670f7d6: Status 404 returned error can't find the container with id f42d327d8a241cf8f95e44720cfd03c9b9566ad4bea71f2725275eb58670f7d6
Apr 17 20:16:36.153287 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:36.153251 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-q6spr" event={"ID":"e114821c-4bf6-4656-8172-0f7ba8948fdc","Type":"ContainerStarted","Data":"f42d327d8a241cf8f95e44720cfd03c9b9566ad4bea71f2725275eb58670f7d6"}
Apr 17 20:16:39.159553 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:39.159509 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-q6spr" event={"ID":"e114821c-4bf6-4656-8172-0f7ba8948fdc","Type":"ContainerStarted","Data":"859f857191e277c8dfd98b8e926237b847c4b47e83f5fdc425384779a078fc10"}
Apr 17 20:16:39.160005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:39.159653 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:16:39.173019 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:16:39.172912 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-q6spr" podStartSLOduration=66.386922995 podStartE2EDuration="1m9.172900148s" podCreationTimestamp="2026-04-17 20:15:30 +0000 UTC" firstStartedPulling="2026-04-17 20:16:36.126355793 +0000 UTC m=+65.791281462" lastFinishedPulling="2026-04-17 20:16:38.912332931 +0000 UTC m=+68.577258615" observedRunningTime="2026-04-17 20:16:39.172415095 +0000 UTC m=+68.837340786" watchObservedRunningTime="2026-04-17 20:16:39.172900148 +0000 UTC m=+68.837825839"
Apr 17 20:17:02.157397 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.157267 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9k5w8"]
Apr 17 20:17:02.161472 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.161450 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9k5w8"
Apr 17 20:17:02.164103 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.164076 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 17 20:17:02.165044 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.165017 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:17:02.165256 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.165233 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-whsws\""
Apr 17 20:17:02.166119 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.166090 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8cv7h"]
Apr 17 20:17:02.172299 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.172274 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9k5w8"]
Apr 17 20:17:02.172410 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.172397 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h"
Apr 17 20:17:02.174835 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.174811 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 17 20:17:02.174835 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.174830 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 17 20:17:02.174989 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.174842 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 17 20:17:02.174989 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.174853 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:17:02.174989 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.174902 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-2xkxl\""
Apr 17 20:17:02.180268 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.180236 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8cv7h"]
Apr 17 20:17:02.180584 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.180564 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 17 20:17:02.235952 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.235911 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7fe830e-bccc-4359-9b7c-afa06ecd5668-trusted-ca\") pod \"console-operator-9d4b6777b-8cv7h\" (UID: \"d7fe830e-bccc-4359-9b7c-afa06ecd5668\") " pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h"
Apr 17 20:17:02.236150 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.235979 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7fe830e-bccc-4359-9b7c-afa06ecd5668-serving-cert\") pod \"console-operator-9d4b6777b-8cv7h\" (UID: \"d7fe830e-bccc-4359-9b7c-afa06ecd5668\") " pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h"
Apr 17 20:17:02.236150 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.236000 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxpc4\" (UniqueName: \"kubernetes.io/projected/d7fe830e-bccc-4359-9b7c-afa06ecd5668-kube-api-access-vxpc4\") pod \"console-operator-9d4b6777b-8cv7h\" (UID: \"d7fe830e-bccc-4359-9b7c-afa06ecd5668\") " pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h"
Apr 17 20:17:02.236150 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.236030 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7fe830e-bccc-4359-9b7c-afa06ecd5668-config\") pod \"console-operator-9d4b6777b-8cv7h\" (UID: \"d7fe830e-bccc-4359-9b7c-afa06ecd5668\") " pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h"
Apr 17 20:17:02.236150 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.236073 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg6sv\" (UniqueName: \"kubernetes.io/projected/1b176a9f-c210-49e2-9286-ec8df6440b2b-kube-api-access-kg6sv\") pod \"volume-data-source-validator-7c6cbb6c87-9k5w8\" (UID: \"1b176a9f-c210-49e2-9286-ec8df6440b2b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9k5w8"
Apr 17 20:17:02.259250 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.259214 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp"]
Apr 17 20:17:02.262055 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.262038 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp"
Apr 17 20:17:02.264461 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.264437 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 20:17:02.264618 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.264437 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 20:17:02.264618 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.264472 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 20:17:02.264618 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.264478 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:17:02.264976 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.264938 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-p4hrc\""
Apr 17 20:17:02.266328 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.266308 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp"]
Apr 17 20:17:02.268944 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.268928 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45"]
Apr 17 20:17:02.269113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.269086 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp"
Apr 17 20:17:02.271237 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.271201 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 17 20:17:02.271673 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.271660 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 20:17:02.271857 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.271844 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-54f44f558f-sgqx4"]
Apr 17 20:17:02.272078 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.272062 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45"
Apr 17 20:17:02.272800 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.272765 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 20:17:02.272989 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.272970 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hn2bx\""
Apr 17 20:17:02.273087 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.273042 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 17 20:17:02.274054 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.274035 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-tqvp4\""
Apr 17 20:17:02.274135 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.274124 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 17 20:17:02.274258 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.274243 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:17:02.274372 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.274355 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 17 20:17:02.274619 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.274600 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 17 20:17:02.274702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.274610 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gw79k"]
Apr 17 20:17:02.274773 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.274759 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:02.276833 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.276817 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 20:17:02.277049 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.277033 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 20:17:02.277216 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.277193 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tkw2s\""
Apr 17 20:17:02.277307 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.277220 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 20:17:02.278024 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.277990 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp"]
Apr 17 20:17:02.278129 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.278113 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-gw79k"
Apr 17 20:17:02.281764 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.281721 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 20:17:02.282081 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.282063 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 20:17:02.282608 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.282568 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 20:17:02.282608 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.282584 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 20:17:02.282930 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.282915 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-x4662\""
Apr 17 20:17:02.284485 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.284464 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45"]
Apr 17 20:17:02.288148 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.288129 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp"]
Apr 17 20:17:02.288211 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.288176 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gw79k"]
Apr 17 20:17:02.289808 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.289788 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 20:17:02.292560 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.292535 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 20:17:02.295697 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.295662 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54f44f558f-sgqx4"]
Apr 17 20:17:02.336609 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336573 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ee130621-350a-49cf-905b-3a5917dcd327-snapshots\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k"
Apr 17 20:17:02.336609 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336608 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfbqv\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-kube-api-access-zfbqv\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:02.336873 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336636 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxpc4\" (UniqueName: \"kubernetes.io/projected/d7fe830e-bccc-4359-9b7c-afa06ecd5668-kube-api-access-vxpc4\") pod \"console-operator-9d4b6777b-8cv7h\" (UID: \"d7fe830e-bccc-4359-9b7c-afa06ecd5668\") " pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h"
Apr 17 20:17:02.336873 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336653 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzrmn\" (UniqueName: \"kubernetes.io/projected/1e417380-f1cf-4d7a-b044-4fb0022ce22c-kube-api-access-dzrmn\") pod \"kube-storage-version-migrator-operator-6769c5d45-tpfmp\" (UID: \"1e417380-f1cf-4d7a-b044-4fb0022ce22c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp"
Apr 17 20:17:02.336873 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336678 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2c9bec13-455f-46f1-b0d0-62183c8c00c7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp"
Apr 17 20:17:02.336873 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336704 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee130621-350a-49cf-905b-3a5917dcd327-service-ca-bundle\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k"
Apr 17 20:17:02.336873 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336721 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/370bc716-d50c-4672-a0c1-cb7135fa9ce7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-wqz45\" (UID: \"370bc716-d50c-4672-a0c1-cb7135fa9ce7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45"
Apr 17 20:17:02.336873 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336735 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt8rq\" (UniqueName: \"kubernetes.io/projected/370bc716-d50c-4672-a0c1-cb7135fa9ce7-kube-api-access-xt8rq\") pod \"service-ca-operator-d6fc45fc5-wqz45\" (UID: \"370bc716-d50c-4672-a0c1-cb7135fa9ce7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45"
Apr 17 20:17:02.336873 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336786 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e417380-f1cf-4d7a-b044-4fb0022ce22c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-tpfmp\" (UID: \"1e417380-f1cf-4d7a-b044-4fb0022ce22c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp"
Apr 17 20:17:02.336873 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-ca-trust-extracted\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:02.336873 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336836 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e417380-f1cf-4d7a-b044-4fb0022ce22c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-tpfmp\" (UID: \"1e417380-f1cf-4d7a-b044-4fb0022ce22c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp"
Apr 17 20:17:02.337343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336879 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee130621-350a-49cf-905b-3a5917dcd327-serving-cert\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k"
Apr 17 20:17:02.337343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336899 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-image-registry-private-configuration\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:02.337343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.336957 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7fe830e-bccc-4359-9b7c-afa06ecd5668-serving-cert\") pod \"console-operator-9d4b6777b-8cv7h\" (UID: \"d7fe830e-bccc-4359-9b7c-afa06ecd5668\") " pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h"
Apr 17 20:17:02.337343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337049 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee130621-350a-49cf-905b-3a5917dcd327-tmp\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k"
Apr 17 20:17:02.337343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337068 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:02.337343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337091 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-trusted-ca\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:02.337343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7fe830e-bccc-4359-9b7c-afa06ecd5668-config\") pod \"console-operator-9d4b6777b-8cv7h\" (UID: \"d7fe830e-bccc-4359-9b7c-afa06ecd5668\") " pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h"
Apr 17 20:17:02.337343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337174 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/370bc716-d50c-4672-a0c1-cb7135fa9ce7-config\") pod \"service-ca-operator-d6fc45fc5-wqz45\" (UID: \"370bc716-d50c-4672-a0c1-cb7135fa9ce7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45"
Apr 17 20:17:02.337343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kg6sv\" (UniqueName: \"kubernetes.io/projected/1b176a9f-c210-49e2-9286-ec8df6440b2b-kube-api-access-kg6sv\") pod \"volume-data-source-validator-7c6cbb6c87-9k5w8\" (UID: \"1b176a9f-c210-49e2-9286-ec8df6440b2b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9k5w8"
Apr 17 20:17:02.337343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337228 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp"
Apr 17 20:17:02.337343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337245 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-certificates\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:02.337343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337273 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-bound-sa-token\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:02.337776 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337408 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-installation-pull-secrets\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:02.337776 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337495 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stt27\" (UniqueName: \"kubernetes.io/projected/ee130621-350a-49cf-905b-3a5917dcd327-kube-api-access-stt27\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") "
pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.337776 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337516 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cv7k\" (UniqueName: \"kubernetes.io/projected/2c9bec13-455f-46f1-b0d0-62183c8c00c7-kube-api-access-2cv7k\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp" Apr 17 20:17:02.337776 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337533 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee130621-350a-49cf-905b-3a5917dcd327-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.337776 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.337560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7fe830e-bccc-4359-9b7c-afa06ecd5668-trusted-ca\") pod \"console-operator-9d4b6777b-8cv7h\" (UID: \"d7fe830e-bccc-4359-9b7c-afa06ecd5668\") " pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" Apr 17 20:17:02.338082 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.338062 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7fe830e-bccc-4359-9b7c-afa06ecd5668-config\") pod \"console-operator-9d4b6777b-8cv7h\" (UID: \"d7fe830e-bccc-4359-9b7c-afa06ecd5668\") " pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" Apr 17 20:17:02.338202 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.338187 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/d7fe830e-bccc-4359-9b7c-afa06ecd5668-trusted-ca\") pod \"console-operator-9d4b6777b-8cv7h\" (UID: \"d7fe830e-bccc-4359-9b7c-afa06ecd5668\") " pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" Apr 17 20:17:02.339601 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.339582 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7fe830e-bccc-4359-9b7c-afa06ecd5668-serving-cert\") pod \"console-operator-9d4b6777b-8cv7h\" (UID: \"d7fe830e-bccc-4359-9b7c-afa06ecd5668\") " pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" Apr 17 20:17:02.349801 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.349778 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxpc4\" (UniqueName: \"kubernetes.io/projected/d7fe830e-bccc-4359-9b7c-afa06ecd5668-kube-api-access-vxpc4\") pod \"console-operator-9d4b6777b-8cv7h\" (UID: \"d7fe830e-bccc-4359-9b7c-afa06ecd5668\") " pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" Apr 17 20:17:02.349931 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.349827 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg6sv\" (UniqueName: \"kubernetes.io/projected/1b176a9f-c210-49e2-9286-ec8df6440b2b-kube-api-access-kg6sv\") pod \"volume-data-source-validator-7c6cbb6c87-9k5w8\" (UID: \"1b176a9f-c210-49e2-9286-ec8df6440b2b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9k5w8" Apr 17 20:17:02.437949 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.437865 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-bound-sa-token\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" 
Apr 17 20:17:02.437949 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.437903 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-installation-pull-secrets\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.437949 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.437937 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stt27\" (UniqueName: \"kubernetes.io/projected/ee130621-350a-49cf-905b-3a5917dcd327-kube-api-access-stt27\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.438205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.437958 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cv7k\" (UniqueName: \"kubernetes.io/projected/2c9bec13-455f-46f1-b0d0-62183c8c00c7-kube-api-access-2cv7k\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp" Apr 17 20:17:02.438205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.437983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee130621-350a-49cf-905b-3a5917dcd327-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.438205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/ee130621-350a-49cf-905b-3a5917dcd327-snapshots\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.438205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438038 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfbqv\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-kube-api-access-zfbqv\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.438205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438064 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzrmn\" (UniqueName: \"kubernetes.io/projected/1e417380-f1cf-4d7a-b044-4fb0022ce22c-kube-api-access-dzrmn\") pod \"kube-storage-version-migrator-operator-6769c5d45-tpfmp\" (UID: \"1e417380-f1cf-4d7a-b044-4fb0022ce22c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp" Apr 17 20:17:02.438205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438092 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2c9bec13-455f-46f1-b0d0-62183c8c00c7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp" Apr 17 20:17:02.438205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438118 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee130621-350a-49cf-905b-3a5917dcd327-service-ca-bundle\") pod \"insights-operator-585dfdc468-gw79k\" (UID: 
\"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.438205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438143 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/370bc716-d50c-4672-a0c1-cb7135fa9ce7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-wqz45\" (UID: \"370bc716-d50c-4672-a0c1-cb7135fa9ce7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45" Apr 17 20:17:02.438205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt8rq\" (UniqueName: \"kubernetes.io/projected/370bc716-d50c-4672-a0c1-cb7135fa9ce7-kube-api-access-xt8rq\") pod \"service-ca-operator-d6fc45fc5-wqz45\" (UID: \"370bc716-d50c-4672-a0c1-cb7135fa9ce7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45" Apr 17 20:17:02.438205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438191 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e417380-f1cf-4d7a-b044-4fb0022ce22c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-tpfmp\" (UID: \"1e417380-f1cf-4d7a-b044-4fb0022ce22c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp" Apr 17 20:17:02.438706 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438221 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-ca-trust-extracted\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.438706 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438256 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e417380-f1cf-4d7a-b044-4fb0022ce22c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-tpfmp\" (UID: \"1e417380-f1cf-4d7a-b044-4fb0022ce22c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp" Apr 17 20:17:02.438706 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438284 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee130621-350a-49cf-905b-3a5917dcd327-serving-cert\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.438706 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438316 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-image-registry-private-configuration\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.438706 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438371 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee130621-350a-49cf-905b-3a5917dcd327-tmp\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.438706 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438659 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee130621-350a-49cf-905b-3a5917dcd327-tmp\") pod \"insights-operator-585dfdc468-gw79k\" (UID: 
\"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.439022 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438781 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ee130621-350a-49cf-905b-3a5917dcd327-snapshots\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.439022 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.438955 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e417380-f1cf-4d7a-b044-4fb0022ce22c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-tpfmp\" (UID: \"1e417380-f1cf-4d7a-b044-4fb0022ce22c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp" Apr 17 20:17:02.439022 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.439008 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.439172 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.439041 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-trusted-ca\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.439172 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.439085 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/370bc716-d50c-4672-a0c1-cb7135fa9ce7-config\") pod \"service-ca-operator-d6fc45fc5-wqz45\" (UID: \"370bc716-d50c-4672-a0c1-cb7135fa9ce7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45" Apr 17 20:17:02.439172 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.439118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2c9bec13-455f-46f1-b0d0-62183c8c00c7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp" Apr 17 20:17:02.439313 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:02.439200 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:17:02.439313 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:02.439206 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:17:02.439313 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:02.439215 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54f44f558f-sgqx4: secret "image-registry-tls" not found Apr 17 20:17:02.439313 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.439237 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee130621-350a-49cf-905b-3a5917dcd327-service-ca-bundle\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.439313 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:02.439275 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls podName:2c9bec13-455f-46f1-b0d0-62183c8c00c7 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:02.939257983 +0000 UTC m=+92.604183658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d66dp" (UID: "2c9bec13-455f-46f1-b0d0-62183c8c00c7") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:17:02.439313 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.439121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp" Apr 17 20:17:02.439619 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.439326 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-certificates\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.439619 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.439548 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee130621-350a-49cf-905b-3a5917dcd327-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.440209 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:17:02.440182 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/370bc716-d50c-4672-a0c1-cb7135fa9ce7-config\") pod \"service-ca-operator-d6fc45fc5-wqz45\" (UID: \"370bc716-d50c-4672-a0c1-cb7135fa9ce7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45" Apr 17 20:17:02.440327 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:02.440257 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls podName:6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec nodeName:}" failed. No retries permitted until 2026-04-17 20:17:02.940239026 +0000 UTC m=+92.605164693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls") pod "image-registry-54f44f558f-sgqx4" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec") : secret "image-registry-tls" not found Apr 17 20:17:02.440535 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.440510 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-certificates\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.440800 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.440778 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-ca-trust-extracted\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.441271 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.441249 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-trusted-ca\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.441615 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.441592 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee130621-350a-49cf-905b-3a5917dcd327-serving-cert\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.442007 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.441985 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e417380-f1cf-4d7a-b044-4fb0022ce22c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-tpfmp\" (UID: \"1e417380-f1cf-4d7a-b044-4fb0022ce22c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp" Apr 17 20:17:02.442144 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.442122 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-image-registry-private-configuration\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.442196 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.442127 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-installation-pull-secrets\") pod 
\"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.443003 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.442986 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/370bc716-d50c-4672-a0c1-cb7135fa9ce7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-wqz45\" (UID: \"370bc716-d50c-4672-a0c1-cb7135fa9ce7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45" Apr 17 20:17:02.450761 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.450707 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt8rq\" (UniqueName: \"kubernetes.io/projected/370bc716-d50c-4672-a0c1-cb7135fa9ce7-kube-api-access-xt8rq\") pod \"service-ca-operator-d6fc45fc5-wqz45\" (UID: \"370bc716-d50c-4672-a0c1-cb7135fa9ce7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45" Apr 17 20:17:02.450940 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.450899 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzrmn\" (UniqueName: \"kubernetes.io/projected/1e417380-f1cf-4d7a-b044-4fb0022ce22c-kube-api-access-dzrmn\") pod \"kube-storage-version-migrator-operator-6769c5d45-tpfmp\" (UID: \"1e417380-f1cf-4d7a-b044-4fb0022ce22c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp" Apr 17 20:17:02.451049 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.450997 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stt27\" (UniqueName: \"kubernetes.io/projected/ee130621-350a-49cf-905b-3a5917dcd327-kube-api-access-stt27\") pod \"insights-operator-585dfdc468-gw79k\" (UID: \"ee130621-350a-49cf-905b-3a5917dcd327\") " pod="openshift-insights/insights-operator-585dfdc468-gw79k" Apr 17 20:17:02.451281 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:17:02.451264 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cv7k\" (UniqueName: \"kubernetes.io/projected/2c9bec13-455f-46f1-b0d0-62183c8c00c7-kube-api-access-2cv7k\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp" Apr 17 20:17:02.452382 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.452363 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-bound-sa-token\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.454343 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.454322 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfbqv\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-kube-api-access-zfbqv\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:02.473023 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.472986 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9k5w8" Apr 17 20:17:02.482783 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.482755 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" Apr 17 20:17:02.572441 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.572289 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp"
Apr 17 20:17:02.592927 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.592879 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45"
Apr 17 20:17:02.600434 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.600403 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9k5w8"]
Apr 17 20:17:02.604223 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:02.604186 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b176a9f_c210_49e2_9286_ec8df6440b2b.slice/crio-30d6a4b41c36bfc4d5f80f0186c8aa91d91305413c36f7207f02570a172e5f94 WatchSource:0}: Error finding container 30d6a4b41c36bfc4d5f80f0186c8aa91d91305413c36f7207f02570a172e5f94: Status 404 returned error can't find the container with id 30d6a4b41c36bfc4d5f80f0186c8aa91d91305413c36f7207f02570a172e5f94
Apr 17 20:17:02.605439 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.605401 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-gw79k"
Apr 17 20:17:02.609205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.609182 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-tdbc5"]
Apr 17 20:17:02.614335 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.614303 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tdbc5"
Apr 17 20:17:02.616576 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.616559 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-hj9wt\""
Apr 17 20:17:02.618700 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.618654 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-tdbc5"]
Apr 17 20:17:02.630008 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.622941 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8cv7h"]
Apr 17 20:17:02.631935 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:02.631900 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7fe830e_bccc_4359_9b7c_afa06ecd5668.slice/crio-7c6cfc806f6ac7cd39218ebd5d8d7bc2574f099e69140a5672aa1841772fcc31 WatchSource:0}: Error finding container 7c6cfc806f6ac7cd39218ebd5d8d7bc2574f099e69140a5672aa1841772fcc31: Status 404 returned error can't find the container with id 7c6cfc806f6ac7cd39218ebd5d8d7bc2574f099e69140a5672aa1841772fcc31
Apr 17 20:17:02.711420 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.711355 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp"]
Apr 17 20:17:02.716669 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:02.716527 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e417380_f1cf_4d7a_b044_4fb0022ce22c.slice/crio-c7708b5da8f262671fc652fb0d1cb08d106e59998f3104057fc25ac2eb8ca603 WatchSource:0}: Error finding container c7708b5da8f262671fc652fb0d1cb08d106e59998f3104057fc25ac2eb8ca603: Status 404 returned error can't find the container with id c7708b5da8f262671fc652fb0d1cb08d106e59998f3104057fc25ac2eb8ca603
Apr 17 20:17:02.743178 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.743140 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7l5h\" (UniqueName: \"kubernetes.io/projected/ca2ad4aa-e46d-428a-8b10-9d150c00e450-kube-api-access-k7l5h\") pod \"network-check-source-8894fc9bd-tdbc5\" (UID: \"ca2ad4aa-e46d-428a-8b10-9d150c00e450\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tdbc5"
Apr 17 20:17:02.745736 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.745706 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45"]
Apr 17 20:17:02.748398 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:02.748371 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod370bc716_d50c_4672_a0c1_cb7135fa9ce7.slice/crio-696a23d81d62cc2fc7e9098f09ef06bf60b312a425f6fd0e337d7f81d0ac573f WatchSource:0}: Error finding container 696a23d81d62cc2fc7e9098f09ef06bf60b312a425f6fd0e337d7f81d0ac573f: Status 404 returned error can't find the container with id 696a23d81d62cc2fc7e9098f09ef06bf60b312a425f6fd0e337d7f81d0ac573f
Apr 17 20:17:02.761277 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.761247 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gw79k"]
Apr 17 20:17:02.764209 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:02.764182 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee130621_350a_49cf_905b_3a5917dcd327.slice/crio-b337a5439646bfdaaa90c30ee97068c7d3f1ead1ac854d1156128b3f26eec1e2 WatchSource:0}: Error finding container b337a5439646bfdaaa90c30ee97068c7d3f1ead1ac854d1156128b3f26eec1e2: Status 404 returned error can't find the container with id b337a5439646bfdaaa90c30ee97068c7d3f1ead1ac854d1156128b3f26eec1e2
Apr 17 20:17:02.844039 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.843998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7l5h\" (UniqueName: \"kubernetes.io/projected/ca2ad4aa-e46d-428a-8b10-9d150c00e450-kube-api-access-k7l5h\") pod \"network-check-source-8894fc9bd-tdbc5\" (UID: \"ca2ad4aa-e46d-428a-8b10-9d150c00e450\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tdbc5"
Apr 17 20:17:02.851686 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.851660 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7l5h\" (UniqueName: \"kubernetes.io/projected/ca2ad4aa-e46d-428a-8b10-9d150c00e450-kube-api-access-k7l5h\") pod \"network-check-source-8894fc9bd-tdbc5\" (UID: \"ca2ad4aa-e46d-428a-8b10-9d150c00e450\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tdbc5"
Apr 17 20:17:02.931884 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.931838 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tdbc5"
Apr 17 20:17:02.944807 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.944775 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp"
Apr 17 20:17:02.944951 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:02.944913 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:02.945023 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:02.944949 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 20:17:02.945077 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:02.945040 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls podName:2c9bec13-455f-46f1-b0d0-62183c8c00c7 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:03.945018363 +0000 UTC m=+93.609944048 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d66dp" (UID: "2c9bec13-455f-46f1-b0d0-62183c8c00c7") : secret "cluster-monitoring-operator-tls" not found
Apr 17 20:17:02.945077 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:02.945028 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:17:02.945077 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:02.945065 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54f44f558f-sgqx4: secret "image-registry-tls" not found
Apr 17 20:17:02.945238 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:02.945122 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls podName:6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec nodeName:}" failed. No retries permitted until 2026-04-17 20:17:03.945106785 +0000 UTC m=+93.610032473 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls") pod "image-registry-54f44f558f-sgqx4" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec") : secret "image-registry-tls" not found
Apr 17 20:17:03.051691 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:03.051648 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-tdbc5"]
Apr 17 20:17:03.054574 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:03.054538 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca2ad4aa_e46d_428a_8b10_9d150c00e450.slice/crio-cd0b95eb90797c92310dc0ba5076f46c2f4b705514e43393a296d1637cdc0e86 WatchSource:0}: Error finding container cd0b95eb90797c92310dc0ba5076f46c2f4b705514e43393a296d1637cdc0e86: Status 404 returned error can't find the container with id cd0b95eb90797c92310dc0ba5076f46c2f4b705514e43393a296d1637cdc0e86
Apr 17 20:17:03.208893 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:03.208850 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp" event={"ID":"1e417380-f1cf-4d7a-b044-4fb0022ce22c","Type":"ContainerStarted","Data":"c7708b5da8f262671fc652fb0d1cb08d106e59998f3104057fc25ac2eb8ca603"}
Apr 17 20:17:03.209911 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:03.209888 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45" event={"ID":"370bc716-d50c-4672-a0c1-cb7135fa9ce7","Type":"ContainerStarted","Data":"696a23d81d62cc2fc7e9098f09ef06bf60b312a425f6fd0e337d7f81d0ac573f"}
Apr 17 20:17:03.211017 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:03.210990 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" event={"ID":"d7fe830e-bccc-4359-9b7c-afa06ecd5668","Type":"ContainerStarted","Data":"7c6cfc806f6ac7cd39218ebd5d8d7bc2574f099e69140a5672aa1841772fcc31"}
Apr 17 20:17:03.212072 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:03.212049 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9k5w8" event={"ID":"1b176a9f-c210-49e2-9286-ec8df6440b2b","Type":"ContainerStarted","Data":"30d6a4b41c36bfc4d5f80f0186c8aa91d91305413c36f7207f02570a172e5f94"}
Apr 17 20:17:03.213473 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:03.213453 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tdbc5" event={"ID":"ca2ad4aa-e46d-428a-8b10-9d150c00e450","Type":"ContainerStarted","Data":"3f044be33c3924492ae93b4b5f8cbdc560dc70de0cc4d6240b08924be469f11c"}
Apr 17 20:17:03.213549 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:03.213481 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tdbc5" event={"ID":"ca2ad4aa-e46d-428a-8b10-9d150c00e450","Type":"ContainerStarted","Data":"cd0b95eb90797c92310dc0ba5076f46c2f4b705514e43393a296d1637cdc0e86"}
Apr 17 20:17:03.214573 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:03.214553 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gw79k" event={"ID":"ee130621-350a-49cf-905b-3a5917dcd327","Type":"ContainerStarted","Data":"b337a5439646bfdaaa90c30ee97068c7d3f1ead1ac854d1156128b3f26eec1e2"}
Apr 17 20:17:03.228217 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:03.228123 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tdbc5" podStartSLOduration=1.228104435 podStartE2EDuration="1.228104435s" podCreationTimestamp="2026-04-17 20:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:17:03.227902511 +0000 UTC m=+92.892828202" watchObservedRunningTime="2026-04-17 20:17:03.228104435 +0000 UTC m=+92.893030121"
Apr 17 20:17:03.957682 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:03.956922 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:03.957682 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:03.956996 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp"
Apr 17 20:17:03.957682 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:03.957196 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 20:17:03.957682 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:03.957265 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls podName:2c9bec13-455f-46f1-b0d0-62183c8c00c7 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:05.957245228 +0000 UTC m=+95.622170900 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d66dp" (UID: "2c9bec13-455f-46f1-b0d0-62183c8c00c7") : secret "cluster-monitoring-operator-tls" not found
Apr 17 20:17:03.957682 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:03.957341 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:17:03.957682 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:03.957365 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54f44f558f-sgqx4: secret "image-registry-tls" not found
Apr 17 20:17:03.957682 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:03.957416 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls podName:6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec nodeName:}" failed. No retries permitted until 2026-04-17 20:17:05.957398707 +0000 UTC m=+95.622324395 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls") pod "image-registry-54f44f558f-sgqx4" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec") : secret "image-registry-tls" not found
Apr 17 20:17:05.973989 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:05.973942 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:05.974442 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:05.974022 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp"
Apr 17 20:17:05.974442 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:05.974126 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:17:05.974442 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:05.974153 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54f44f558f-sgqx4: secret "image-registry-tls" not found
Apr 17 20:17:05.974442 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:05.974183 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 20:17:05.974442 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:05.974227 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls podName:6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec nodeName:}" failed. No retries permitted until 2026-04-17 20:17:09.974211104 +0000 UTC m=+99.639136773 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls") pod "image-registry-54f44f558f-sgqx4" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec") : secret "image-registry-tls" not found
Apr 17 20:17:05.974442 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:05.974242 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls podName:2c9bec13-455f-46f1-b0d0-62183c8c00c7 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:09.974236195 +0000 UTC m=+99.639161863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d66dp" (UID: "2c9bec13-455f-46f1-b0d0-62183c8c00c7") : secret "cluster-monitoring-operator-tls" not found
Apr 17 20:17:07.226675 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.226633 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp" event={"ID":"1e417380-f1cf-4d7a-b044-4fb0022ce22c","Type":"ContainerStarted","Data":"724cc691089f31918a1a6bd47509e71718197e749ffaf086547700072aace2f5"}
Apr 17 20:17:07.228315 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.228281 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45" event={"ID":"370bc716-d50c-4672-a0c1-cb7135fa9ce7","Type":"ContainerStarted","Data":"8d9727088496740f7806894afb39bd9dc161b3366164015fd31bd8273e5c2923"}
Apr 17 20:17:07.230021 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.229998 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/0.log"
Apr 17 20:17:07.230139 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.230041 2579 generic.go:358] "Generic (PLEG): container finished" podID="d7fe830e-bccc-4359-9b7c-afa06ecd5668" containerID="812fcf427393cf34620ef44e9c7cd06fedb0b5987223247b925f782665fce955" exitCode=255
Apr 17 20:17:07.230203 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.230129 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" event={"ID":"d7fe830e-bccc-4359-9b7c-afa06ecd5668","Type":"ContainerDied","Data":"812fcf427393cf34620ef44e9c7cd06fedb0b5987223247b925f782665fce955"}
Apr 17 20:17:07.230327 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.230309 2579 scope.go:117] "RemoveContainer" containerID="812fcf427393cf34620ef44e9c7cd06fedb0b5987223247b925f782665fce955"
Apr 17 20:17:07.231559 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.231518 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9k5w8" event={"ID":"1b176a9f-c210-49e2-9286-ec8df6440b2b","Type":"ContainerStarted","Data":"1078907eabad326152c329d5af37491acc8f6971beb90fccef20a6d86d90771d"}
Apr 17 20:17:07.232908 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.232886 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gw79k" event={"ID":"ee130621-350a-49cf-905b-3a5917dcd327","Type":"ContainerStarted","Data":"61fd8c685a75ef7404495a245576d0f8b7d70f957031376b4c1ade8b7e033c97"}
Apr 17 20:17:07.241954 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.241899 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp" podStartSLOduration=1.266926655 podStartE2EDuration="5.24188529s" podCreationTimestamp="2026-04-17 20:17:02 +0000 UTC" firstStartedPulling="2026-04-17 20:17:02.718466541 +0000 UTC m=+92.383392223" lastFinishedPulling="2026-04-17 20:17:06.693425179 +0000 UTC m=+96.358350858" observedRunningTime="2026-04-17 20:17:07.241492902 +0000 UTC m=+96.906418595" watchObservedRunningTime="2026-04-17 20:17:07.24188529 +0000 UTC m=+96.906810977"
Apr 17 20:17:07.254630 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.254580 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9k5w8" podStartSLOduration=1.18481824 podStartE2EDuration="5.254561194s" podCreationTimestamp="2026-04-17 20:17:02 +0000 UTC" firstStartedPulling="2026-04-17 20:17:02.610305759 +0000 UTC m=+92.275231441" lastFinishedPulling="2026-04-17 20:17:06.68004871 +0000 UTC m=+96.344974395" observedRunningTime="2026-04-17 20:17:07.254012133 +0000 UTC m=+96.918937829" watchObservedRunningTime="2026-04-17 20:17:07.254561194 +0000 UTC m=+96.919486885"
Apr 17 20:17:07.272776 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.272684 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45" podStartSLOduration=1.335610862 podStartE2EDuration="5.272666399s" podCreationTimestamp="2026-04-17 20:17:02 +0000 UTC" firstStartedPulling="2026-04-17 20:17:02.750652035 +0000 UTC m=+92.415577707" lastFinishedPulling="2026-04-17 20:17:06.687707572 +0000 UTC m=+96.352633244" observedRunningTime="2026-04-17 20:17:07.271927273 +0000 UTC m=+96.936852964" watchObservedRunningTime="2026-04-17 20:17:07.272666399 +0000 UTC m=+96.937592091"
Apr 17 20:17:07.287675 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.287581 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-gw79k" podStartSLOduration=1.363962757 podStartE2EDuration="5.287563675s" podCreationTimestamp="2026-04-17 20:17:02 +0000 UTC" firstStartedPulling="2026-04-17 20:17:02.766048594 +0000 UTC m=+92.430974262" lastFinishedPulling="2026-04-17 20:17:06.689649512 +0000 UTC m=+96.354575180" observedRunningTime="2026-04-17 20:17:07.287088168 +0000 UTC m=+96.952013862" watchObservedRunningTime="2026-04-17 20:17:07.287563675 +0000 UTC m=+96.952489366"
Apr 17 20:17:07.591577 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.591491 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs"
Apr 17 20:17:07.591727 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:07.591622 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf"
Apr 17 20:17:07.591727 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:07.591636 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:17:07.591727 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:07.591710 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert podName:159a3a3e-608e-405f-ac09-ff7186a9c710 nodeName:}" failed. No retries permitted until 2026-04-17 20:18:11.591693682 +0000 UTC m=+161.256619350 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert") pod "ingress-canary-zvwgs" (UID: "159a3a3e-608e-405f-ac09-ff7186a9c710") : secret "canary-serving-cert" not found
Apr 17 20:17:07.591852 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:07.591769 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:17:07.591852 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:07.591823 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls podName:a083116d-3c26-492c-b99a-c51bbaa51aa4 nodeName:}" failed. No retries permitted until 2026-04-17 20:18:11.591808761 +0000 UTC m=+161.256734435 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls") pod "dns-default-vkvbf" (UID: "a083116d-3c26-492c-b99a-c51bbaa51aa4") : secret "dns-default-metrics-tls" not found
Apr 17 20:17:08.238680 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:08.238587 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/1.log"
Apr 17 20:17:08.239785 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:08.239645 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/0.log"
Apr 17 20:17:08.239785 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:08.239686 2579 generic.go:358] "Generic (PLEG): container finished" podID="d7fe830e-bccc-4359-9b7c-afa06ecd5668" containerID="29ad10430a09dc3a61a94dee5d7882857c14b2ac2c35c0ff1805adba68a793b5" exitCode=255
Apr 17 20:17:08.243025 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:08.241168 2579 scope.go:117] "RemoveContainer" containerID="29ad10430a09dc3a61a94dee5d7882857c14b2ac2c35c0ff1805adba68a793b5"
Apr 17 20:17:08.243025 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:08.241371 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8cv7h_openshift-console-operator(d7fe830e-bccc-4359-9b7c-afa06ecd5668)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" podUID="d7fe830e-bccc-4359-9b7c-afa06ecd5668"
Apr 17 20:17:08.243025 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:08.241593 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" event={"ID":"d7fe830e-bccc-4359-9b7c-afa06ecd5668","Type":"ContainerDied","Data":"29ad10430a09dc3a61a94dee5d7882857c14b2ac2c35c0ff1805adba68a793b5"}
Apr 17 20:17:08.243025 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:08.241628 2579 scope.go:117] "RemoveContainer" containerID="812fcf427393cf34620ef44e9c7cd06fedb0b5987223247b925f782665fce955"
Apr 17 20:17:09.243640 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:09.243607 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/1.log"
Apr 17 20:17:09.244135 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:09.244012 2579 scope.go:117] "RemoveContainer" containerID="29ad10430a09dc3a61a94dee5d7882857c14b2ac2c35c0ff1805adba68a793b5"
Apr 17 20:17:09.244200 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:09.244182 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8cv7h_openshift-console-operator(d7fe830e-bccc-4359-9b7c-afa06ecd5668)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" podUID="d7fe830e-bccc-4359-9b7c-afa06ecd5668"
Apr 17 20:17:10.011892 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:10.011849 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:10.012104 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:10.011904 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp"
Apr 17 20:17:10.012104 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:10.012007 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 20:17:10.012104 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:10.012015 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:17:10.012104 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:10.012040 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54f44f558f-sgqx4: secret "image-registry-tls" not found
Apr 17 20:17:10.012104 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:10.012060 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls podName:2c9bec13-455f-46f1-b0d0-62183c8c00c7 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:18.012046872 +0000 UTC m=+107.676972541 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d66dp" (UID: "2c9bec13-455f-46f1-b0d0-62183c8c00c7") : secret "cluster-monitoring-operator-tls" not found
Apr 17 20:17:10.012104 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:10.012093 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls podName:6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec nodeName:}" failed. No retries permitted until 2026-04-17 20:17:18.012075349 +0000 UTC m=+107.677001030 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls") pod "image-registry-54f44f558f-sgqx4" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec") : secret "image-registry-tls" not found
Apr 17 20:17:10.164603 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:10.164572 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-q6spr"
Apr 17 20:17:11.236504 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:11.236468 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w9tk6_804b57c1-49b2-4e56-8da1-70a591e070e2/dns-node-resolver/0.log"
Apr 17 20:17:11.835080 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:11.835052 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-msb4t_963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18/node-ca/0.log"
Apr 17 20:17:12.483728 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:12.483686 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h"
Apr 17 20:17:12.483728 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:12.483730 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h"
Apr 17 20:17:12.484159 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:12.484104 2579 scope.go:117] "RemoveContainer" containerID="29ad10430a09dc3a61a94dee5d7882857c14b2ac2c35c0ff1805adba68a793b5"
Apr 17 20:17:12.484288 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:12.484269 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8cv7h_openshift-console-operator(d7fe830e-bccc-4359-9b7c-afa06ecd5668)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" podUID="d7fe830e-bccc-4359-9b7c-afa06ecd5668"
Apr 17 20:17:13.636722 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:13.636692 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-tpfmp_1e417380-f1cf-4d7a-b044-4fb0022ce22c/kube-storage-version-migrator-operator/0.log"
Apr 17 20:17:18.077496 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:18.077446 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:18.077917 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:18.077534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp"
Apr 17 20:17:18.077917 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:18.077677 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 20:17:18.077917 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:18.077779 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls podName:2c9bec13-455f-46f1-b0d0-62183c8c00c7 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:34.077738558 +0000 UTC m=+123.742664237 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d66dp" (UID: "2c9bec13-455f-46f1-b0d0-62183c8c00c7") : secret "cluster-monitoring-operator-tls" not found
Apr 17 20:17:18.079912 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:18.079893 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls\") pod \"image-registry-54f44f558f-sgqx4\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:18.199822 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:18.199788 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:18.330095 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:18.329976 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54f44f558f-sgqx4"]
Apr 17 20:17:18.332549 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:18.332514 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cc4f90c_a00c_4e6e_a420_a1b86b1c4fec.slice/crio-3fa50e9b93b507071423b41ab4193f1f892728ae7406d475442a2b69fecddc8e WatchSource:0}: Error finding container 3fa50e9b93b507071423b41ab4193f1f892728ae7406d475442a2b69fecddc8e: Status 404 returned error can't find the container with id 3fa50e9b93b507071423b41ab4193f1f892728ae7406d475442a2b69fecddc8e
Apr 17 20:17:19.269622 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:19.269587 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" event={"ID":"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec","Type":"ContainerStarted","Data":"69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36"}
Apr 17 20:17:19.269622 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:19.269623 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" event={"ID":"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec","Type":"ContainerStarted","Data":"3fa50e9b93b507071423b41ab4193f1f892728ae7406d475442a2b69fecddc8e"}
Apr 17 20:17:19.270047 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:19.269649 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
Apr 17 20:17:19.287361 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:19.287254 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4"
podStartSLOduration=17.287234258 podStartE2EDuration="17.287234258s" podCreationTimestamp="2026-04-17 20:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:17:19.286842712 +0000 UTC m=+108.951768395" watchObservedRunningTime="2026-04-17 20:17:19.287234258 +0000 UTC m=+108.952159959" Apr 17 20:17:25.873652 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:25.873614 2579 scope.go:117] "RemoveContainer" containerID="29ad10430a09dc3a61a94dee5d7882857c14b2ac2c35c0ff1805adba68a793b5" Apr 17 20:17:26.290030 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:26.290004 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/1.log" Apr 17 20:17:26.290198 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:26.290061 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" event={"ID":"d7fe830e-bccc-4359-9b7c-afa06ecd5668","Type":"ContainerStarted","Data":"2364c2b0d1a0ca9950ffbf30739398c2e1ecee8189c6ac666ab71d6e2c6979ac"} Apr 17 20:17:26.290377 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:26.290346 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" Apr 17 20:17:26.306306 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:26.306254 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" podStartSLOduration=20.253827786 podStartE2EDuration="24.306240799s" podCreationTimestamp="2026-04-17 20:17:02 +0000 UTC" firstStartedPulling="2026-04-17 20:17:02.634847862 +0000 UTC m=+92.299773534" lastFinishedPulling="2026-04-17 20:17:06.687260865 +0000 UTC m=+96.352186547" observedRunningTime="2026-04-17 20:17:26.305009336 +0000 UTC 
m=+115.969935036" watchObservedRunningTime="2026-04-17 20:17:26.306240799 +0000 UTC m=+115.971166489" Apr 17 20:17:26.549250 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:26.549166 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-8cv7h" Apr 17 20:17:30.358122 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.358092 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-54f44f558f-sgqx4"] Apr 17 20:17:30.370292 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.370261 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6hz2t"] Apr 17 20:17:30.373732 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.373708 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6c8d5fbf5c-sfqff"] Apr 17 20:17:30.373905 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.373885 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.376413 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.376390 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-59lxf\"" Apr 17 20:17:30.376516 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.376390 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 20:17:30.376516 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.376452 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 20:17:30.376765 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.376728 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.382172 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.382148 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv7vs\" (UniqueName: \"kubernetes.io/projected/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-kube-api-access-sv7vs\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.382347 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.382324 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.382479 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.382391 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-crio-socket\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.382479 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.382447 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-data-volume\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.382596 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.382476 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.388565 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.388536 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6hz2t"] Apr 17 20:17:30.389813 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.389794 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c8d5fbf5c-sfqff"] Apr 17 20:17:30.483884 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.483850 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-trusted-ca\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.483884 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.483889 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-crio-socket\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.484106 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.483908 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-registry-tls\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " 
pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.484106 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.483933 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-data-volume\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.484106 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.483950 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.484106 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.483986 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-registry-certificates\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.484106 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.483991 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-crio-socket\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.484316 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.484137 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv7vs\" 
(UniqueName: \"kubernetes.io/projected/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-kube-api-access-sv7vs\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.484316 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.484169 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-image-registry-private-configuration\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.484316 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.484190 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-installation-pull-secrets\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.484316 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.484288 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7fwm\" (UniqueName: \"kubernetes.io/projected/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-kube-api-access-k7fwm\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.484316 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.484308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-data-volume\") pod \"insights-runtime-extractor-6hz2t\" (UID: 
\"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.484522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.484336 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-bound-sa-token\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.484522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.484373 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.484522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.484393 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-ca-trust-extracted\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.484844 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.484801 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.486487 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.486466 2579 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.499863 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.499839 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv7vs\" (UniqueName: \"kubernetes.io/projected/5167bfd3-1c9b-4daa-a7ca-08927f909b5f-kube-api-access-sv7vs\") pod \"insights-runtime-extractor-6hz2t\" (UID: \"5167bfd3-1c9b-4daa-a7ca-08927f909b5f\") " pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.585041 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.585001 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7fwm\" (UniqueName: \"kubernetes.io/projected/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-kube-api-access-k7fwm\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.585228 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.585074 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-bound-sa-token\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.585228 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.585122 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-ca-trust-extracted\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " 
pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.585228 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.585146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-trusted-ca\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.585228 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.585173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-registry-tls\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.585228 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.585217 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-registry-certificates\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.585458 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.585402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-image-registry-private-configuration\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.585458 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.585446 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-installation-pull-secrets\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.585628 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.585605 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-ca-trust-extracted\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.586170 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.586092 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-registry-certificates\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.586289 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.586228 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-trusted-ca\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.587843 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.587824 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-installation-pull-secrets\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 
17 20:17:30.587921 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.587873 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-image-registry-private-configuration\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.588102 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.588083 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-registry-tls\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.592589 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.592566 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-bound-sa-token\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.592711 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.592692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7fwm\" (UniqueName: \"kubernetes.io/projected/0c1607fb-6a2c-4213-8de7-34c392a4fd1c-kube-api-access-k7fwm\") pod \"image-registry-6c8d5fbf5c-sfqff\" (UID: \"0c1607fb-6a2c-4213-8de7-34c392a4fd1c\") " pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.686203 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.686169 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6hz2t" Apr 17 20:17:30.692995 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.692967 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:30.823166 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.823132 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6hz2t"] Apr 17 20:17:30.826189 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:30.826157 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5167bfd3_1c9b_4daa_a7ca_08927f909b5f.slice/crio-3e98a8bec41a1f092b84d5b32338773647e83e66b9829d65fdd340e41063c1ac WatchSource:0}: Error finding container 3e98a8bec41a1f092b84d5b32338773647e83e66b9829d65fdd340e41063c1ac: Status 404 returned error can't find the container with id 3e98a8bec41a1f092b84d5b32338773647e83e66b9829d65fdd340e41063c1ac Apr 17 20:17:30.843586 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:30.843559 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c8d5fbf5c-sfqff"] Apr 17 20:17:30.846513 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:30.846482 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c1607fb_6a2c_4213_8de7_34c392a4fd1c.slice/crio-f65def0b00ebe1e162a95e32c81abb67f3af407b079fa90dd9345b9da487259c WatchSource:0}: Error finding container f65def0b00ebe1e162a95e32c81abb67f3af407b079fa90dd9345b9da487259c: Status 404 returned error can't find the container with id f65def0b00ebe1e162a95e32c81abb67f3af407b079fa90dd9345b9da487259c Apr 17 20:17:31.304479 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:31.304439 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" event={"ID":"0c1607fb-6a2c-4213-8de7-34c392a4fd1c","Type":"ContainerStarted","Data":"3a559f1e3d7fe69bfd4ecb5ddc9730e68595927153b26509913e328171a5c487"} Apr 17 20:17:31.304479 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:31.304482 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" event={"ID":"0c1607fb-6a2c-4213-8de7-34c392a4fd1c","Type":"ContainerStarted","Data":"f65def0b00ebe1e162a95e32c81abb67f3af407b079fa90dd9345b9da487259c"} Apr 17 20:17:31.304702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:31.304583 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:17:31.305964 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:31.305930 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6hz2t" event={"ID":"5167bfd3-1c9b-4daa-a7ca-08927f909b5f","Type":"ContainerStarted","Data":"3f5296f58e6805dc49e906b5da695da632ee4e1ebf2dac4855c052fc64fabb85"} Apr 17 20:17:31.306084 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:31.305967 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6hz2t" event={"ID":"5167bfd3-1c9b-4daa-a7ca-08927f909b5f","Type":"ContainerStarted","Data":"3e98a8bec41a1f092b84d5b32338773647e83e66b9829d65fdd340e41063c1ac"} Apr 17 20:17:31.321614 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:31.321542 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" podStartSLOduration=1.321523964 podStartE2EDuration="1.321523964s" podCreationTimestamp="2026-04-17 20:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:17:31.320698259 +0000 UTC m=+120.985623948" 
watchObservedRunningTime="2026-04-17 20:17:31.321523964 +0000 UTC m=+120.986449655" Apr 17 20:17:32.311006 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:32.310964 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6hz2t" event={"ID":"5167bfd3-1c9b-4daa-a7ca-08927f909b5f","Type":"ContainerStarted","Data":"4243f3a4f4824be91f4a113953510057ee6e787b8f6490236cde3323e46f1660"} Apr 17 20:17:33.316735 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:33.316650 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6hz2t" event={"ID":"5167bfd3-1c9b-4daa-a7ca-08927f909b5f","Type":"ContainerStarted","Data":"ff23ce59464069f389d3dafe843f254cf8a086d80a9c9a5ea911fa7201607b58"} Apr 17 20:17:33.333115 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:33.333062 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6hz2t" podStartSLOduration=1.226114935 podStartE2EDuration="3.33303957s" podCreationTimestamp="2026-04-17 20:17:30 +0000 UTC" firstStartedPulling="2026-04-17 20:17:30.889197806 +0000 UTC m=+120.554123481" lastFinishedPulling="2026-04-17 20:17:32.996122447 +0000 UTC m=+122.661048116" observedRunningTime="2026-04-17 20:17:33.33204254 +0000 UTC m=+122.996968229" watchObservedRunningTime="2026-04-17 20:17:33.33303957 +0000 UTC m=+122.997965260" Apr 17 20:17:34.118844 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:34.118782 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp" Apr 17 20:17:34.121331 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:34.121300 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9bec13-455f-46f1-b0d0-62183c8c00c7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d66dp\" (UID: \"2c9bec13-455f-46f1-b0d0-62183c8c00c7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp" Apr 17 20:17:34.385980 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:34.385952 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hn2bx\"" Apr 17 20:17:34.393685 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:34.393664 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp" Apr 17 20:17:34.509176 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:34.509143 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp"] Apr 17 20:17:34.512072 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:34.512041 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c9bec13_455f_46f1_b0d0_62183c8c00c7.slice/crio-11c24e9681d16da1ca5e9acc0fee99ed1697b4ea7ee43a22882fd35bb87abfe5 WatchSource:0}: Error finding container 11c24e9681d16da1ca5e9acc0fee99ed1697b4ea7ee43a22882fd35bb87abfe5: Status 404 returned error can't find the container with id 11c24e9681d16da1ca5e9acc0fee99ed1697b4ea7ee43a22882fd35bb87abfe5 Apr 17 20:17:35.322988 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:35.322951 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp" event={"ID":"2c9bec13-455f-46f1-b0d0-62183c8c00c7","Type":"ContainerStarted","Data":"11c24e9681d16da1ca5e9acc0fee99ed1697b4ea7ee43a22882fd35bb87abfe5"} Apr 17 20:17:36.939480 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:36.939439 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc"] Apr 17 20:17:36.942737 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:36.942699 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc" Apr 17 20:17:36.944981 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:36.944957 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 20:17:36.945110 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:36.945092 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-92gpt\"" Apr 17 20:17:36.952020 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:36.951995 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc"] Apr 17 20:17:37.040092 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:37.040046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7e5f305d-f8a2-4e23-8714-04f855b755fb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qghqc\" (UID: \"7e5f305d-f8a2-4e23-8714-04f855b755fb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc" Apr 17 20:17:37.141526 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:37.141484 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7e5f305d-f8a2-4e23-8714-04f855b755fb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qghqc\" (UID: \"7e5f305d-f8a2-4e23-8714-04f855b755fb\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc" Apr 17 20:17:37.141714 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:37.141644 2579 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 17 20:17:37.141825 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:37.141728 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e5f305d-f8a2-4e23-8714-04f855b755fb-tls-certificates podName:7e5f305d-f8a2-4e23-8714-04f855b755fb nodeName:}" failed. No retries permitted until 2026-04-17 20:17:37.641706949 +0000 UTC m=+127.306632617 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/7e5f305d-f8a2-4e23-8714-04f855b755fb-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-qghqc" (UID: "7e5f305d-f8a2-4e23-8714-04f855b755fb") : secret "prometheus-operator-admission-webhook-tls" not found Apr 17 20:17:37.328996 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:37.328909 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp" event={"ID":"2c9bec13-455f-46f1-b0d0-62183c8c00c7","Type":"ContainerStarted","Data":"d84b7dd36e94ea27f25207d8ee6fc8e51fb99dc541759c633cf0b6eedea3208c"} Apr 17 20:17:37.343229 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:37.343176 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d66dp" podStartSLOduration=33.393293622 podStartE2EDuration="35.343159607s" podCreationTimestamp="2026-04-17 20:17:02 +0000 UTC" firstStartedPulling="2026-04-17 20:17:34.513974423 +0000 UTC m=+124.178900092" lastFinishedPulling="2026-04-17 20:17:36.463840396 +0000 UTC m=+126.128766077" observedRunningTime="2026-04-17 20:17:37.343029699 +0000 UTC m=+127.007955392" 
watchObservedRunningTime="2026-04-17 20:17:37.343159607 +0000 UTC m=+127.008085297" Apr 17 20:17:37.644955 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:37.644920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7e5f305d-f8a2-4e23-8714-04f855b755fb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qghqc\" (UID: \"7e5f305d-f8a2-4e23-8714-04f855b755fb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc" Apr 17 20:17:37.647592 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:37.647567 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7e5f305d-f8a2-4e23-8714-04f855b755fb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qghqc\" (UID: \"7e5f305d-f8a2-4e23-8714-04f855b755fb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc" Apr 17 20:17:37.852089 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:37.852046 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc" Apr 17 20:17:37.970341 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:37.970309 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc"] Apr 17 20:17:37.973142 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:37.973115 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e5f305d_f8a2_4e23_8714_04f855b755fb.slice/crio-b18de2eff8ea10913f48d50e00681b99e7fab10175ae336486ae9bcd19ce05be WatchSource:0}: Error finding container b18de2eff8ea10913f48d50e00681b99e7fab10175ae336486ae9bcd19ce05be: Status 404 returned error can't find the container with id b18de2eff8ea10913f48d50e00681b99e7fab10175ae336486ae9bcd19ce05be Apr 17 20:17:38.332121 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:38.332032 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc" event={"ID":"7e5f305d-f8a2-4e23-8714-04f855b755fb","Type":"ContainerStarted","Data":"b18de2eff8ea10913f48d50e00681b99e7fab10175ae336486ae9bcd19ce05be"} Apr 17 20:17:39.662312 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:39.662274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs\") pod \"network-metrics-daemon-842wl\" (UID: \"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:17:39.664945 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:39.664919 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcb80713-90b2-4ae8-95b5-a07c24ab45e2-metrics-certs\") pod \"network-metrics-daemon-842wl\" (UID: 
\"fcb80713-90b2-4ae8-95b5-a07c24ab45e2\") " pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:17:39.902323 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:39.902294 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rfhz5\"" Apr 17 20:17:39.911000 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:39.910974 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-842wl" Apr 17 20:17:40.027656 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:40.027517 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-842wl"] Apr 17 20:17:40.030142 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:40.030109 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcb80713_90b2_4ae8_95b5_a07c24ab45e2.slice/crio-15a9e6943cbb5a5f0e00e512236fe3a37968c4b10098fbeb71643e10137eeb77 WatchSource:0}: Error finding container 15a9e6943cbb5a5f0e00e512236fe3a37968c4b10098fbeb71643e10137eeb77: Status 404 returned error can't find the container with id 15a9e6943cbb5a5f0e00e512236fe3a37968c4b10098fbeb71643e10137eeb77 Apr 17 20:17:40.338992 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:40.338899 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc" event={"ID":"7e5f305d-f8a2-4e23-8714-04f855b755fb","Type":"ContainerStarted","Data":"cf46bdf06fe97627e5c0b3c655443e3f1daead45b9c112ec43edc627c072c465"} Apr 17 20:17:40.339167 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:40.339143 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc" Apr 17 20:17:40.340007 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:40.339971 2579 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/network-metrics-daemon-842wl" event={"ID":"fcb80713-90b2-4ae8-95b5-a07c24ab45e2","Type":"ContainerStarted","Data":"15a9e6943cbb5a5f0e00e512236fe3a37968c4b10098fbeb71643e10137eeb77"} Apr 17 20:17:40.343971 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:40.343952 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc" Apr 17 20:17:40.352441 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:40.352398 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qghqc" podStartSLOduration=2.811550136 podStartE2EDuration="4.352385367s" podCreationTimestamp="2026-04-17 20:17:36 +0000 UTC" firstStartedPulling="2026-04-17 20:17:37.975037982 +0000 UTC m=+127.639963651" lastFinishedPulling="2026-04-17 20:17:39.515873212 +0000 UTC m=+129.180798882" observedRunningTime="2026-04-17 20:17:40.351822345 +0000 UTC m=+130.016748035" watchObservedRunningTime="2026-04-17 20:17:40.352385367 +0000 UTC m=+130.017311057" Apr 17 20:17:40.363118 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:40.363082 2579 patch_prober.go:28] interesting pod/image-registry-54f44f558f-sgqx4 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 20:17:40.363249 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:40.363131 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" podUID="6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 20:17:40.995597 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:40.995561 2579 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-s82nl"] Apr 17 20:17:41.011995 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.011964 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-s82nl"] Apr 17 20:17:41.012162 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.012102 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.014464 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.014437 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 20:17:41.014464 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.014450 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 20:17:41.014662 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.014457 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 20:17:41.014662 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.014497 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-k2bxl\"" Apr 17 20:17:41.075931 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.075891 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f033e518-f79c-4c74-9235-1a284adf65c0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: \"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.076098 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.075955 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f033e518-f79c-4c74-9235-1a284adf65c0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: \"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.076174 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.076105 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f033e518-f79c-4c74-9235-1a284adf65c0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: \"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.076174 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.076161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2q6l\" (UniqueName: \"kubernetes.io/projected/f033e518-f79c-4c74-9235-1a284adf65c0-kube-api-access-v2q6l\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: \"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.177067 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.177021 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f033e518-f79c-4c74-9235-1a284adf65c0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: \"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.177257 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.177080 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2q6l\" (UniqueName: 
\"kubernetes.io/projected/f033e518-f79c-4c74-9235-1a284adf65c0-kube-api-access-v2q6l\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: \"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.177257 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.177129 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f033e518-f79c-4c74-9235-1a284adf65c0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: \"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.177257 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.177164 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f033e518-f79c-4c74-9235-1a284adf65c0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: \"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.177433 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:41.177268 2579 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 20:17:41.177433 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:41.177344 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f033e518-f79c-4c74-9235-1a284adf65c0-prometheus-operator-tls podName:f033e518-f79c-4c74-9235-1a284adf65c0 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:41.677322978 +0000 UTC m=+131.342248648 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f033e518-f79c-4c74-9235-1a284adf65c0-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-s82nl" (UID: "f033e518-f79c-4c74-9235-1a284adf65c0") : secret "prometheus-operator-tls" not found Apr 17 20:17:41.177855 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.177827 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f033e518-f79c-4c74-9235-1a284adf65c0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: \"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.179886 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.179864 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f033e518-f79c-4c74-9235-1a284adf65c0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: \"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.185822 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.185799 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2q6l\" (UniqueName: \"kubernetes.io/projected/f033e518-f79c-4c74-9235-1a284adf65c0-kube-api-access-v2q6l\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: \"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.682253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.682214 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f033e518-f79c-4c74-9235-1a284adf65c0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: 
\"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.684768 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.684716 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f033e518-f79c-4c74-9235-1a284adf65c0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s82nl\" (UID: \"f033e518-f79c-4c74-9235-1a284adf65c0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:41.925043 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:41.924995 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" Apr 17 20:17:42.041007 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:42.040983 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-s82nl"] Apr 17 20:17:42.043910 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:42.043878 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf033e518_f79c_4c74_9235_1a284adf65c0.slice/crio-8a695f5f905814203720a6482ee33456d1fcdbf04fd53e69abf9180b11ee66fd WatchSource:0}: Error finding container 8a695f5f905814203720a6482ee33456d1fcdbf04fd53e69abf9180b11ee66fd: Status 404 returned error can't find the container with id 8a695f5f905814203720a6482ee33456d1fcdbf04fd53e69abf9180b11ee66fd Apr 17 20:17:42.346957 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:42.346857 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-842wl" event={"ID":"fcb80713-90b2-4ae8-95b5-a07c24ab45e2","Type":"ContainerStarted","Data":"042aa15d6158a2245b73085ecd4cb72c99ea7d25903176a2931ee034ac65c4eb"} Apr 17 20:17:42.346957 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:42.346897 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-842wl" event={"ID":"fcb80713-90b2-4ae8-95b5-a07c24ab45e2","Type":"ContainerStarted","Data":"a06460f67bf10e7b286b1e45b02ac02a6019043a5796617271acda5d584941fe"} Apr 17 20:17:42.347961 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:42.347937 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" event={"ID":"f033e518-f79c-4c74-9235-1a284adf65c0","Type":"ContainerStarted","Data":"8a695f5f905814203720a6482ee33456d1fcdbf04fd53e69abf9180b11ee66fd"} Apr 17 20:17:42.362075 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:42.362025 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-842wl" podStartSLOduration=130.055116995 podStartE2EDuration="2m11.362011107s" podCreationTimestamp="2026-04-17 20:15:31 +0000 UTC" firstStartedPulling="2026-04-17 20:17:40.032140025 +0000 UTC m=+129.697065693" lastFinishedPulling="2026-04-17 20:17:41.339034134 +0000 UTC m=+131.003959805" observedRunningTime="2026-04-17 20:17:42.36037984 +0000 UTC m=+132.025305529" watchObservedRunningTime="2026-04-17 20:17:42.362011107 +0000 UTC m=+132.026936839" Apr 17 20:17:43.352702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:43.352669 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" event={"ID":"f033e518-f79c-4c74-9235-1a284adf65c0","Type":"ContainerStarted","Data":"f5ccd37d4c857b7aad58fbc7775f1af965d714b9be5c86301125a595af2e24a2"} Apr 17 20:17:44.357654 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:44.357622 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" event={"ID":"f033e518-f79c-4c74-9235-1a284adf65c0","Type":"ContainerStarted","Data":"ea488f498aea6cc7e693ce932d3b879138808ad3de3688c48873185d5cf5eff9"} Apr 17 20:17:44.372609 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:44.372548 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-s82nl" podStartSLOduration=3.148867417 podStartE2EDuration="4.372528335s" podCreationTimestamp="2026-04-17 20:17:40 +0000 UTC" firstStartedPulling="2026-04-17 20:17:42.045850569 +0000 UTC m=+131.710776250" lastFinishedPulling="2026-04-17 20:17:43.269511486 +0000 UTC m=+132.934437168" observedRunningTime="2026-04-17 20:17:44.371433179 +0000 UTC m=+134.036358869" watchObservedRunningTime="2026-04-17 20:17:44.372528335 +0000 UTC m=+134.037454027" Apr 17 20:17:46.334435 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.334395 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2qf44"] Apr 17 20:17:46.337697 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.337661 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44" Apr 17 20:17:46.338601 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.338575 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9zsg7"] Apr 17 20:17:46.340654 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.340633 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 20:17:46.341522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.341500 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-dshnq\"" Apr 17 20:17:46.341618 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.341519 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 20:17:46.341618 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.341550 2579 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 20:17:46.341721 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.341626 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9zsg7" Apr 17 20:17:46.343480 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.343457 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 20:17:46.343710 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.343691 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 20:17:46.344107 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.344091 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kf58f\"" Apr 17 20:17:46.344311 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.344296 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 20:17:46.347650 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.347627 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2qf44"] Apr 17 20:17:46.423438 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423389 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-metrics-client-ca\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7" Apr 17 20:17:46.423438 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423430 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-wtmp\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7" Apr 17 20:17:46.423649 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423455 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-accelerators-collector-config\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7" Apr 17 20:17:46.423649 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423485 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-root\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7" Apr 17 20:17:46.423649 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423509 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-tls\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7" Apr 17 20:17:46.423649 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423558 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a5a95a60-e2cc-428e-b995-a69225112a29-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.423649 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423608 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6zbc\" (UniqueName: \"kubernetes.io/projected/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-kube-api-access-c6zbc\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.423649 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423641 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a5a95a60-e2cc-428e-b995-a69225112a29-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.423923 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423672 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-textfile\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.423923 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423703 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a5a95a60-e2cc-428e-b995-a69225112a29-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.423923 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423774 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a5a95a60-e2cc-428e-b995-a69225112a29-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.423923 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln4kh\" (UniqueName: \"kubernetes.io/projected/a5a95a60-e2cc-428e-b995-a69225112a29-kube-api-access-ln4kh\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.423923 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423841 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-sys\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.423923 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423878 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5a95a60-e2cc-428e-b995-a69225112a29-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.424154 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.423953 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.524981 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.524944 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-root\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.524981 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.524980 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-tls\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.525235 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525018 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a5a95a60-e2cc-428e-b995-a69225112a29-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.525235 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525038 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6zbc\" (UniqueName: \"kubernetes.io/projected/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-kube-api-access-c6zbc\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.525235 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525062 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a5a95a60-e2cc-428e-b995-a69225112a29-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.525235 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525060 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-root\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.525235 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525095 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-textfile\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.525484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a5a95a60-e2cc-428e-b995-a69225112a29-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.525484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525429 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a5a95a60-e2cc-428e-b995-a69225112a29-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.525484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525458 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln4kh\" (UniqueName: \"kubernetes.io/projected/a5a95a60-e2cc-428e-b995-a69225112a29-kube-api-access-ln4kh\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.525638 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525492 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-sys\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.525638 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525559 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-sys\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.525638 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525578 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-textfile\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.525819 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525684 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5a95a60-e2cc-428e-b995-a69225112a29-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.525819 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.525819 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525803 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-metrics-client-ca\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.526001 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.525979 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a5a95a60-e2cc-428e-b995-a69225112a29-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.526069 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.526009 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-wtmp\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.526069 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.526040 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-accelerators-collector-config\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.526196 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.526170 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-wtmp\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.526281 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:46.526253 2579 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 17 20:17:46.526425 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.526335 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a5a95a60-e2cc-428e-b995-a69225112a29-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.526425 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:46.526405 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5a95a60-e2cc-428e-b995-a69225112a29-kube-state-metrics-tls podName:a5a95a60-e2cc-428e-b995-a69225112a29 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:47.02636942 +0000 UTC m=+136.691295091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/a5a95a60-e2cc-428e-b995-a69225112a29-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-2qf44" (UID: "a5a95a60-e2cc-428e-b995-a69225112a29") : secret "kube-state-metrics-tls" not found
Apr 17 20:17:46.526570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.526508 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-metrics-client-ca\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.526628 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.526604 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-accelerators-collector-config\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.526809 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.526789 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a5a95a60-e2cc-428e-b995-a69225112a29-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.527971 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.527946 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-tls\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.528078 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.527966 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a5a95a60-e2cc-428e-b995-a69225112a29-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.528136 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.528119 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.532277 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.532252 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln4kh\" (UniqueName: \"kubernetes.io/projected/a5a95a60-e2cc-428e-b995-a69225112a29-kube-api-access-ln4kh\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:46.532474 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.532462 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6zbc\" (UniqueName: \"kubernetes.io/projected/b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b-kube-api-access-c6zbc\") pod \"node-exporter-9zsg7\" (UID: \"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b\") " pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.655467 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:46.655429 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9zsg7"
Apr 17 20:17:46.664941 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:46.664873 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4e5556b_eed0_4b0d_bdad_d95dc24a9f4b.slice/crio-f0bf1e7e6196cba819ef4cb1e238a99523535acff9c7b6c8bc0d5cc3b8a0d55c WatchSource:0}: Error finding container f0bf1e7e6196cba819ef4cb1e238a99523535acff9c7b6c8bc0d5cc3b8a0d55c: Status 404 returned error can't find the container with id f0bf1e7e6196cba819ef4cb1e238a99523535acff9c7b6c8bc0d5cc3b8a0d55c
Apr 17 20:17:47.029872 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:47.029791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5a95a60-e2cc-428e-b995-a69225112a29-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:47.032265 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:47.032237 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5a95a60-e2cc-428e-b995-a69225112a29-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2qf44\" (UID: \"a5a95a60-e2cc-428e-b995-a69225112a29\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:47.250128 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:47.250093 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44"
Apr 17 20:17:47.372862 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:47.372798 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9zsg7" event={"ID":"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b","Type":"ContainerStarted","Data":"f0bf1e7e6196cba819ef4cb1e238a99523535acff9c7b6c8bc0d5cc3b8a0d55c"}
Apr 17 20:17:47.411492 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:47.411411 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2qf44"]
Apr 17 20:17:47.582535 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:47.582460 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5a95a60_e2cc_428e_b995_a69225112a29.slice/crio-0af2ead60164e67d6ce14b4a53c74765191ddf182f70eab8b7ecb8f01632c76a WatchSource:0}: Error finding container 0af2ead60164e67d6ce14b4a53c74765191ddf182f70eab8b7ecb8f01632c76a: Status 404 returned error can't find the container with id 0af2ead60164e67d6ce14b4a53c74765191ddf182f70eab8b7ecb8f01632c76a
Apr 17 20:17:48.307064 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.307026 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"]
Apr 17 20:17:48.312871 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.312838 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.315538 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.315423 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 20:17:48.315538 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.315501 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 20:17:48.315769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.315554 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-gl8sq\""
Apr 17 20:17:48.315769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.315501 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 20:17:48.315769 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.315632 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 20:17:48.315939 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.315806 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-20hu07ammosoc\""
Apr 17 20:17:48.315939 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.315886 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 20:17:48.320438 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.320410 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"]
Apr 17 20:17:48.377270 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.377232 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44" event={"ID":"a5a95a60-e2cc-428e-b995-a69225112a29","Type":"ContainerStarted","Data":"0af2ead60164e67d6ce14b4a53c74765191ddf182f70eab8b7ecb8f01632c76a"}
Apr 17 20:17:48.378861 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.378834 2579 generic.go:358] "Generic (PLEG): container finished" podID="b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b" containerID="5fee9a76ae5653bf8658e43dea28392d286481c7819b67629706544080adc11f" exitCode=0
Apr 17 20:17:48.378986 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.378906 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9zsg7" event={"ID":"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b","Type":"ContainerDied","Data":"5fee9a76ae5653bf8658e43dea28392d286481c7819b67629706544080adc11f"}
Apr 17 20:17:48.442450 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.442421 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-grpc-tls\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.442799 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.442680 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-tls\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.442799 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.442758 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.442984 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.442838 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae81404b-bc25-469b-be3f-d5f02eb9709a-metrics-client-ca\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.442984 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.442884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmrbr\" (UniqueName: \"kubernetes.io/projected/ae81404b-bc25-469b-be3f-d5f02eb9709a-kube-api-access-jmrbr\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.442984 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.442914 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.443125 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.442991 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.443125 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.443030 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.544440 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.544396 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-grpc-tls\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.544619 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.544503 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-tls\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.544619 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.544539 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.544619 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.544584 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae81404b-bc25-469b-be3f-d5f02eb9709a-metrics-client-ca\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.544823 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.544619 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmrbr\" (UniqueName: \"kubernetes.io/projected/ae81404b-bc25-469b-be3f-d5f02eb9709a-kube-api-access-jmrbr\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.544823 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.544704 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.544823 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.544791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.545036 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.544828 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.545648 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.545571 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae81404b-bc25-469b-be3f-d5f02eb9709a-metrics-client-ca\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.547955 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.547932 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.548059 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.548005 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.548112 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.548067 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.548219 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.548194 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.548565 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.548526 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-thanos-querier-tls\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.550125 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.550100 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ae81404b-bc25-469b-be3f-d5f02eb9709a-secret-grpc-tls\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.551815 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.551785 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmrbr\" (UniqueName: \"kubernetes.io/projected/ae81404b-bc25-469b-be3f-d5f02eb9709a-kube-api-access-jmrbr\") pod \"thanos-querier-5c55f7d6c7-bb687\" (UID: \"ae81404b-bc25-469b-be3f-d5f02eb9709a\") " pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.624798 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.624689 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
Apr 17 20:17:48.893038 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:48.892988 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"]
Apr 17 20:17:48.896824 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:48.896793 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae81404b_bc25_469b_be3f_d5f02eb9709a.slice/crio-a6a3beaeed4d672aef97d62b5cfb96c0ac4e920cfd2bfdabde4f28ebd3042133 WatchSource:0}: Error finding container a6a3beaeed4d672aef97d62b5cfb96c0ac4e920cfd2bfdabde4f28ebd3042133: Status 404 returned error can't find the container with id a6a3beaeed4d672aef97d62b5cfb96c0ac4e920cfd2bfdabde4f28ebd3042133
Apr 17 20:17:49.383479 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:49.383439 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44" event={"ID":"a5a95a60-e2cc-428e-b995-a69225112a29","Type":"ContainerStarted","Data":"67028e4c9def50fa7efc617dbf8d91318d6a6afdc7ffb825a4ed3e260bfa9252"}
Apr 17 20:17:49.383479 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:49.383486 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44" event={"ID":"a5a95a60-e2cc-428e-b995-a69225112a29","Type":"ContainerStarted","Data":"137e3c49c774dd0a31ff59bd99814be4c998c53d1614339eee4c8da3a0c62a56"}
Apr 17 20:17:49.384021 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:49.383501 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44" event={"ID":"a5a95a60-e2cc-428e-b995-a69225112a29","Type":"ContainerStarted","Data":"f900e7deb798a76e8bc9d27a17abb1017e32901e10350eeb09e76e646225aa0e"}
Apr 17 20:17:49.384656 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:49.384630 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687" event={"ID":"ae81404b-bc25-469b-be3f-d5f02eb9709a","Type":"ContainerStarted","Data":"a6a3beaeed4d672aef97d62b5cfb96c0ac4e920cfd2bfdabde4f28ebd3042133"}
Apr 17 20:17:49.386522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:49.386501 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9zsg7" event={"ID":"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b","Type":"ContainerStarted","Data":"3a16db2fa45ea2d0f6fbe7482b6d6dfa9cc179df3e995b68a798d66cdf00153b"}
Apr 17 20:17:49.386615 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:49.386526 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9zsg7" event={"ID":"b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b","Type":"ContainerStarted","Data":"c53b3c7c6b857388911330ce75a67a69431848b1a0b4c8b79e5f60d4f2f79af2"}
Apr 17 20:17:49.401782 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:49.401713 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-2qf44" podStartSLOduration=2.171022096 podStartE2EDuration="3.401700576s" podCreationTimestamp="2026-04-17 20:17:46 +0000 UTC" firstStartedPulling="2026-04-17 20:17:47.584322998 +0000 UTC m=+137.249248670" lastFinishedPulling="2026-04-17 20:17:48.815001469 +0000 UTC m=+138.479927150" observedRunningTime="2026-04-17 20:17:49.399518634 +0000 UTC m=+139.064444325" watchObservedRunningTime="2026-04-17 20:17:49.401700576 +0000 UTC m=+139.066626267"
Apr 17 20:17:49.421962 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:49.421914 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9zsg7" podStartSLOduration=2.454174057 podStartE2EDuration="3.421875219s" podCreationTimestamp="2026-04-17 20:17:46 +0000 UTC" firstStartedPulling="2026-04-17 20:17:46.666643574 +0000 UTC m=+136.331569241" lastFinishedPulling="2026-04-17 20:17:47.634344717 +0000 UTC m=+137.299270403" observedRunningTime="2026-04-17 20:17:49.420791075 +0000 UTC m=+139.085716766" watchObservedRunningTime="2026-04-17 20:17:49.421875219 +0000 UTC m=+139.086800964"
Apr 17 20:17:50.362496 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:50.362452 2579 patch_prober.go:28] interesting pod/image-registry-54f44f558f-sgqx4 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 20:17:50.362690 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:50.362504 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" podUID="6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 20:17:50.698368 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:50.698329 2579 patch_prober.go:28] interesting pod/image-registry-6c8d5fbf5c-sfqff container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 20:17:50.698853 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:50.698394 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" podUID="0c1607fb-6a2c-4213-8de7-34c392a4fd1c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 20:17:51.393920 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:51.393885 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687"
event={"ID":"ae81404b-bc25-469b-be3f-d5f02eb9709a","Type":"ContainerStarted","Data":"9c36fa2ceeb82a0065a6793b5b0f01411c243ba02c2bcf6b6fe353f492569d65"} Apr 17 20:17:51.394081 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:51.393929 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687" event={"ID":"ae81404b-bc25-469b-be3f-d5f02eb9709a","Type":"ContainerStarted","Data":"ef294648d7e96161dcd60a6bad28cb9c7cc568eba88d840d085bad8e945e2888"} Apr 17 20:17:51.394081 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:51.393943 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687" event={"ID":"ae81404b-bc25-469b-be3f-d5f02eb9709a","Type":"ContainerStarted","Data":"f472ee6706099b28c67baed5615de160c3c15b2719d8ee2045e7152c40b6f7c0"} Apr 17 20:17:52.316708 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:52.316653 2579 patch_prober.go:28] interesting pod/image-registry-6c8d5fbf5c-sfqff container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 20:17:52.317047 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:52.316732 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" podUID="0c1607fb-6a2c-4213-8de7-34c392a4fd1c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 20:17:52.401070 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:52.401015 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687" event={"ID":"ae81404b-bc25-469b-be3f-d5f02eb9709a","Type":"ContainerStarted","Data":"c539ac6f71f3e510cdc1c259159d8f0d4413aa330f1ca5dd54b458bf264d952d"} Apr 17 20:17:52.401070 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:17:52.401072 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687" event={"ID":"ae81404b-bc25-469b-be3f-d5f02eb9709a","Type":"ContainerStarted","Data":"ab76d92a3ee3fb3b82a8f24cc0ef7c6cdc58536628d13ca51755a9d55807c5c4"} Apr 17 20:17:52.401368 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:52.401090 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687" event={"ID":"ae81404b-bc25-469b-be3f-d5f02eb9709a","Type":"ContainerStarted","Data":"0347995810a4ca7f48120bb78364732993156a58d8e1849dd5218a32fbc2159b"} Apr 17 20:17:52.401368 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:52.401231 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687" Apr 17 20:17:52.423770 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:52.423688 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687" podStartSLOduration=1.214622624 podStartE2EDuration="4.423670204s" podCreationTimestamp="2026-04-17 20:17:48 +0000 UTC" firstStartedPulling="2026-04-17 20:17:48.898828571 +0000 UTC m=+138.563754243" lastFinishedPulling="2026-04-17 20:17:52.107876152 +0000 UTC m=+141.772801823" observedRunningTime="2026-04-17 20:17:52.421239644 +0000 UTC m=+142.086165373" watchObservedRunningTime="2026-04-17 20:17:52.423670204 +0000 UTC m=+142.088595895" Apr 17 20:17:53.897709 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.897671 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c97464997-lpw7s"] Apr 17 20:17:53.900980 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.900958 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:53.903325 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.903306 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 20:17:53.904197 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.904173 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 20:17:53.904281 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.904200 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-hzws2\"" Apr 17 20:17:53.904281 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.904213 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 20:17:53.904281 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.904226 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 20:17:53.904453 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.904433 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 20:17:53.904492 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.904472 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 20:17:53.904587 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.904571 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 20:17:53.908065 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.908047 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 20:17:53.910959 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:17:53.910938 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c97464997-lpw7s"] Apr 17 20:17:53.996895 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.996851 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzdfh\" (UniqueName: \"kubernetes.io/projected/16642ce0-f944-451f-8118-544de2e0acda-kube-api-access-tzdfh\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:53.997076 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.996919 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16642ce0-f944-451f-8118-544de2e0acda-console-serving-cert\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:53.997076 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.996966 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-trusted-ca-bundle\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:53.997076 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.996994 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16642ce0-f944-451f-8118-544de2e0acda-console-oauth-config\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:53.997076 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.997020 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-oauth-serving-cert\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:53.997076 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.997043 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-console-config\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:53.997076 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:53.997069 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-service-ca\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.098471 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.098432 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-console-config\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.098471 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.098471 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-service-ca\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.098709 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.098496 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzdfh\" (UniqueName: \"kubernetes.io/projected/16642ce0-f944-451f-8118-544de2e0acda-kube-api-access-tzdfh\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.098709 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.098529 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16642ce0-f944-451f-8118-544de2e0acda-console-serving-cert\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.098709 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.098583 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-trusted-ca-bundle\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.098709 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.098623 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16642ce0-f944-451f-8118-544de2e0acda-console-oauth-config\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.098709 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.098662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-oauth-serving-cert\") pod \"console-6c97464997-lpw7s\" (UID: 
\"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.099359 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.099328 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-service-ca\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.099470 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.099374 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-oauth-serving-cert\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.099470 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.099340 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-console-config\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.099560 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.099439 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-trusted-ca-bundle\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.101243 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.101217 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16642ce0-f944-451f-8118-544de2e0acda-console-serving-cert\") pod \"console-6c97464997-lpw7s\" 
(UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.101331 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.101246 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16642ce0-f944-451f-8118-544de2e0acda-console-oauth-config\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.105677 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.105654 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzdfh\" (UniqueName: \"kubernetes.io/projected/16642ce0-f944-451f-8118-544de2e0acda-kube-api-access-tzdfh\") pod \"console-6c97464997-lpw7s\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.210703 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.210614 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:17:54.346089 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.346053 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c97464997-lpw7s"] Apr 17 20:17:54.350209 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:54.350173 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16642ce0_f944_451f_8118_544de2e0acda.slice/crio-ba41afb63232a2527f632c122afdd07f7677c93bfa19ab0f3d84b4ff43b80f83 WatchSource:0}: Error finding container ba41afb63232a2527f632c122afdd07f7677c93bfa19ab0f3d84b4ff43b80f83: Status 404 returned error can't find the container with id ba41afb63232a2527f632c122afdd07f7677c93bfa19ab0f3d84b4ff43b80f83 Apr 17 20:17:54.408737 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:54.408690 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c97464997-lpw7s" event={"ID":"16642ce0-f944-451f-8118-544de2e0acda","Type":"ContainerStarted","Data":"ba41afb63232a2527f632c122afdd07f7677c93bfa19ab0f3d84b4ff43b80f83"} Apr 17 20:17:55.379257 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.379110 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" podUID="6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" containerName="registry" containerID="cri-o://69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36" gracePeriod=30 Apr 17 20:17:55.630284 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.630211 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:55.725456 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.725423 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-trusted-ca\") pod \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " Apr 17 20:17:55.725456 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.725470 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-installation-pull-secrets\") pod \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " Apr 17 20:17:55.725703 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.725536 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-ca-trust-extracted\") pod \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " Apr 17 20:17:55.725703 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.725565 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfbqv\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-kube-api-access-zfbqv\") pod \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " Apr 17 20:17:55.725703 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.725596 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls\") pod \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " Apr 17 20:17:55.725703 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.725679 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-certificates\") pod \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " Apr 17 20:17:55.725913 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.725716 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-image-registry-private-configuration\") pod \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " Apr 17 20:17:55.725913 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.725740 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-bound-sa-token\") pod \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\" (UID: \"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec\") " Apr 17 20:17:55.726072 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.726044 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:17:55.726313 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.726265 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:17:55.728848 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.728820 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-kube-api-access-zfbqv" (OuterVolumeSpecName: "kube-api-access-zfbqv") pod "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec"). InnerVolumeSpecName "kube-api-access-zfbqv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:17:55.728941 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.728858 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:17:55.729024 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.729004 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:17:55.729096 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.729039 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:17:55.729270 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.729247 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:17:55.737110 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.737079 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" (UID: "6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:17:55.827235 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.827202 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-ca-trust-extracted\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:17:55.827235 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.827231 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zfbqv\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-kube-api-access-zfbqv\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:17:55.827235 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.827241 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-tls\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:17:55.827470 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:17:55.827250 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-registry-certificates\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:17:55.827470 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.827259 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-image-registry-private-configuration\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:17:55.827470 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.827268 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-bound-sa-token\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:17:55.827470 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.827281 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-trusted-ca\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:17:55.827470 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:55.827294 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec-installation-pull-secrets\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:17:56.415687 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:56.415646 2579 generic.go:358] "Generic (PLEG): container finished" podID="6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" containerID="69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36" exitCode=0 Apr 17 20:17:56.416183 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:56.415716 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" Apr 17 20:17:56.416183 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:56.415756 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" event={"ID":"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec","Type":"ContainerDied","Data":"69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36"} Apr 17 20:17:56.416183 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:56.415808 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54f44f558f-sgqx4" event={"ID":"6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec","Type":"ContainerDied","Data":"3fa50e9b93b507071423b41ab4193f1f892728ae7406d475442a2b69fecddc8e"} Apr 17 20:17:56.416183 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:56.415831 2579 scope.go:117] "RemoveContainer" containerID="69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36" Apr 17 20:17:56.439446 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:56.439411 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-54f44f558f-sgqx4"] Apr 17 20:17:56.442906 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:56.442875 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-54f44f558f-sgqx4"] Apr 17 20:17:56.879861 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:56.879819 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" path="/var/lib/kubelet/pods/6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec/volumes" Apr 17 20:17:57.056087 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.056061 2579 scope.go:117] "RemoveContainer" containerID="69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36" Apr 17 20:17:57.056458 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:17:57.056432 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36\": container with ID starting with 69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36 not found: ID does not exist" containerID="69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36" Apr 17 20:17:57.056579 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.056465 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36"} err="failed to get container status \"69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36\": rpc error: code = NotFound desc = could not find container \"69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36\": container with ID starting with 69a83ad60813db82cf65643d237f729ed455c1691a220559211832a9715d4f36 not found: ID does not exist" Apr 17 20:17:57.421019 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.420925 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c97464997-lpw7s" event={"ID":"16642ce0-f944-451f-8118-544de2e0acda","Type":"ContainerStarted","Data":"af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12"} Apr 17 20:17:57.437412 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.437353 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c97464997-lpw7s" podStartSLOduration=1.683493543 podStartE2EDuration="4.437334781s" podCreationTimestamp="2026-04-17 20:17:53 +0000 UTC" firstStartedPulling="2026-04-17 20:17:54.352022436 +0000 UTC m=+144.016948105" lastFinishedPulling="2026-04-17 20:17:57.105863665 +0000 UTC m=+146.770789343" observedRunningTime="2026-04-17 20:17:57.436055948 +0000 UTC m=+147.100981638" watchObservedRunningTime="2026-04-17 20:17:57.437334781 +0000 UTC m=+147.102260471" Apr 17 20:17:57.668488 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.668454 2579 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56d57c576f-6k4pn"] Apr 17 20:17:57.668808 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.668795 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" containerName="registry" Apr 17 20:17:57.668859 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.668810 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" containerName="registry" Apr 17 20:17:57.668895 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.668860 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6cc4f90c-a00c-4e6e-a420-a1b86b1c4fec" containerName="registry" Apr 17 20:17:57.671837 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.671792 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.680260 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.680235 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56d57c576f-6k4pn"] Apr 17 20:17:57.850403 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.850356 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cht5z\" (UniqueName: \"kubernetes.io/projected/c26a743e-8bd2-4496-be1d-a5a24a32f42a-kube-api-access-cht5z\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.850554 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.850418 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-oauth-config\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " 
pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.850554 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.850483 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-config\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.850554 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.850513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-trusted-ca-bundle\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.850554 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.850538 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-oauth-serving-cert\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.850705 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.850564 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-serving-cert\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.850705 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.850636 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-service-ca\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.952082 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.951984 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-config\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.952082 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.952025 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-trusted-ca-bundle\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.952263 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.952155 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-oauth-serving-cert\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.952263 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.952210 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-serving-cert\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.952335 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.952271 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-service-ca\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.952373 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.952353 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cht5z\" (UniqueName: \"kubernetes.io/projected/c26a743e-8bd2-4496-be1d-a5a24a32f42a-kube-api-access-cht5z\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.952486 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.952393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-oauth-config\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.952868 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.952829 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-config\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.952993 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.952901 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-oauth-serving-cert\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.953038 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.952992 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-trusted-ca-bundle\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.953038 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.952994 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-service-ca\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.954885 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.954863 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-oauth-config\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.955049 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.955032 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-serving-cert\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.959577 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:57.959561 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cht5z\" (UniqueName: \"kubernetes.io/projected/c26a743e-8bd2-4496-be1d-a5a24a32f42a-kube-api-access-cht5z\") pod \"console-56d57c576f-6k4pn\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:57.981623 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:17:57.981594 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:17:58.108437 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:58.108412 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56d57c576f-6k4pn"] Apr 17 20:17:58.110999 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:17:58.110972 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc26a743e_8bd2_4496_be1d_a5a24a32f42a.slice/crio-1ce180312e456604ac8f12063a093b3e5526688c153ac78e419a1d29aee31add WatchSource:0}: Error finding container 1ce180312e456604ac8f12063a093b3e5526688c153ac78e419a1d29aee31add: Status 404 returned error can't find the container with id 1ce180312e456604ac8f12063a093b3e5526688c153ac78e419a1d29aee31add Apr 17 20:17:58.410886 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:58.410859 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5c55f7d6c7-bb687" Apr 17 20:17:58.425113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:58.425078 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d57c576f-6k4pn" event={"ID":"c26a743e-8bd2-4496-be1d-a5a24a32f42a","Type":"ContainerStarted","Data":"118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e"} Apr 17 20:17:58.425113 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:58.425115 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d57c576f-6k4pn" event={"ID":"c26a743e-8bd2-4496-be1d-a5a24a32f42a","Type":"ContainerStarted","Data":"1ce180312e456604ac8f12063a093b3e5526688c153ac78e419a1d29aee31add"} Apr 17 20:17:58.445813 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:17:58.445727 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56d57c576f-6k4pn" 
podStartSLOduration=1.445709032 podStartE2EDuration="1.445709032s" podCreationTimestamp="2026-04-17 20:17:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:17:58.444148354 +0000 UTC m=+148.109074047" watchObservedRunningTime="2026-04-17 20:17:58.445709032 +0000 UTC m=+148.110634723" Apr 17 20:18:00.696696 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:00.696665 2579 patch_prober.go:28] interesting pod/image-registry-6c8d5fbf5c-sfqff container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 20:18:00.697084 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:00.696715 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" podUID="0c1607fb-6a2c-4213-8de7-34c392a4fd1c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 20:18:02.315544 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:02.315510 2579 patch_prober.go:28] interesting pod/image-registry-6c8d5fbf5c-sfqff container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 20:18:02.315944 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:02.315562 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" podUID="0c1607fb-6a2c-4213-8de7-34c392a4fd1c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 20:18:04.211448 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:04.211392 
2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:18:04.211448 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:04.211459 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:18:04.216252 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:04.216229 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:18:04.446371 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:04.446335 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:18:06.722942 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:18:06.722892 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vkvbf" podUID="a083116d-3c26-492c-b99a-c51bbaa51aa4" Apr 17 20:18:06.740223 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:18:06.740179 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zvwgs" podUID="159a3a3e-608e-405f-ac09-ff7186a9c710" Apr 17 20:18:07.450672 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:07.450635 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vkvbf" Apr 17 20:18:07.982108 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:07.982018 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:18:07.982452 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:07.982110 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:18:07.986779 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:07.986753 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:18:08.457326 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:08.457291 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:18:08.500169 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:08.500136 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c97464997-lpw7s"] Apr 17 20:18:10.697131 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:10.697097 2579 patch_prober.go:28] interesting pod/image-registry-6c8d5fbf5c-sfqff container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 20:18:10.697600 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:10.697157 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" podUID="0c1607fb-6a2c-4213-8de7-34c392a4fd1c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 20:18:10.697600 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:10.697207 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:18:10.697838 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:10.697807 2579 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"3a559f1e3d7fe69bfd4ecb5ddc9730e68595927153b26509913e328171a5c487"} pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" containerMessage="Container registry failed liveness probe, will be restarted" Apr 17 20:18:10.701651 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:10.701622 2579 patch_prober.go:28] interesting pod/image-registry-6c8d5fbf5c-sfqff container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 20:18:10.701809 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:10.701682 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" podUID="0c1607fb-6a2c-4213-8de7-34c392a4fd1c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 20:18:11.676931 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:11.676889 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:18:11.676931 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:11.676943 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " 
pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:18:11.679459 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:11.679430 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a083116d-3c26-492c-b99a-c51bbaa51aa4-metrics-tls\") pod \"dns-default-vkvbf\" (UID: \"a083116d-3c26-492c-b99a-c51bbaa51aa4\") " pod="openshift-dns/dns-default-vkvbf" Apr 17 20:18:11.679674 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:11.679652 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/159a3a3e-608e-405f-ac09-ff7186a9c710-cert\") pod \"ingress-canary-zvwgs\" (UID: \"159a3a3e-608e-405f-ac09-ff7186a9c710\") " pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:18:11.953944 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:11.953854 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-x9tzj\"" Apr 17 20:18:11.962089 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:11.962062 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vkvbf" Apr 17 20:18:12.089550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:12.089524 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vkvbf"] Apr 17 20:18:12.091976 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:18:12.091932 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda083116d_3c26_492c_b99a_c51bbaa51aa4.slice/crio-4e0423812811ba61fac5331f440343c714d136be5729ce045b1cc1d4d2d4e4b4 WatchSource:0}: Error finding container 4e0423812811ba61fac5331f440343c714d136be5729ce045b1cc1d4d2d4e4b4: Status 404 returned error can't find the container with id 4e0423812811ba61fac5331f440343c714d136be5729ce045b1cc1d4d2d4e4b4 Apr 17 20:18:12.466889 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:12.466852 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vkvbf" event={"ID":"a083116d-3c26-492c-b99a-c51bbaa51aa4","Type":"ContainerStarted","Data":"4e0423812811ba61fac5331f440343c714d136be5729ce045b1cc1d4d2d4e4b4"} Apr 17 20:18:14.475166 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:14.475132 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vkvbf" event={"ID":"a083116d-3c26-492c-b99a-c51bbaa51aa4","Type":"ContainerStarted","Data":"ee1f6a5b42c33497d3695eff8d708265f10960dbccef789060d2286e3df59952"} Apr 17 20:18:14.475166 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:14.475169 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vkvbf" event={"ID":"a083116d-3c26-492c-b99a-c51bbaa51aa4","Type":"ContainerStarted","Data":"39191ff49c5dcbf37ecb81f01e2d7d454cb2f8cf6e0739f99790cb609c832737"} Apr 17 20:18:14.475582 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:14.475195 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vkvbf" Apr 17 20:18:14.496038 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:14.495988 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vkvbf" podStartSLOduration=130.000982032 podStartE2EDuration="2m11.495971677s" podCreationTimestamp="2026-04-17 20:16:03 +0000 UTC" firstStartedPulling="2026-04-17 20:18:12.09393314 +0000 UTC m=+161.758858810" lastFinishedPulling="2026-04-17 20:18:13.588922787 +0000 UTC m=+163.253848455" observedRunningTime="2026-04-17 20:18:14.495022955 +0000 UTC m=+164.159948658" watchObservedRunningTime="2026-04-17 20:18:14.495971677 +0000 UTC m=+164.160897366" Apr 17 20:18:17.486031 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:17.485946 2579 generic.go:358] "Generic (PLEG): container finished" podID="370bc716-d50c-4672-a0c1-cb7135fa9ce7" containerID="8d9727088496740f7806894afb39bd9dc161b3366164015fd31bd8273e5c2923" exitCode=0 Apr 17 20:18:17.486031 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:17.485998 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45" event={"ID":"370bc716-d50c-4672-a0c1-cb7135fa9ce7","Type":"ContainerDied","Data":"8d9727088496740f7806894afb39bd9dc161b3366164015fd31bd8273e5c2923"} Apr 17 20:18:17.486476 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:17.486300 2579 scope.go:117] "RemoveContainer" containerID="8d9727088496740f7806894afb39bd9dc161b3366164015fd31bd8273e5c2923" Apr 17 20:18:18.489882 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:18.489848 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wqz45" event={"ID":"370bc716-d50c-4672-a0c1-cb7135fa9ce7","Type":"ContainerStarted","Data":"18f3d5f5c467e361a8913c2a8f1917589bd8245ef39c81587d84d0ecd0a41a4a"} Apr 17 20:18:18.873982 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:18.873896 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:18:18.876264 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:18.876243 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ndtv9\"" Apr 17 20:18:18.884880 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:18.884856 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zvwgs" Apr 17 20:18:19.009323 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:19.009287 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zvwgs"] Apr 17 20:18:19.012442 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:18:19.012412 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod159a3a3e_608e_405f_ac09_ff7186a9c710.slice/crio-5fde6bd1d00d56bad2121a7373d05ec14a71c375d5e38e1057ca9c4d9061b0ee WatchSource:0}: Error finding container 5fde6bd1d00d56bad2121a7373d05ec14a71c375d5e38e1057ca9c4d9061b0ee: Status 404 returned error can't find the container with id 5fde6bd1d00d56bad2121a7373d05ec14a71c375d5e38e1057ca9c4d9061b0ee Apr 17 20:18:19.494931 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:19.494889 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zvwgs" event={"ID":"159a3a3e-608e-405f-ac09-ff7186a9c710","Type":"ContainerStarted","Data":"5fde6bd1d00d56bad2121a7373d05ec14a71c375d5e38e1057ca9c4d9061b0ee"} Apr 17 20:18:20.702184 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:20.702160 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:18:21.503115 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:21.503075 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zvwgs" 
event={"ID":"159a3a3e-608e-405f-ac09-ff7186a9c710","Type":"ContainerStarted","Data":"8c8e6cd3e5f933157de15de07448d358ffb25eaf6142859a03b3ac75876b39d2"} Apr 17 20:18:21.518728 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:21.518664 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zvwgs" podStartSLOduration=136.885858378 podStartE2EDuration="2m18.518644349s" podCreationTimestamp="2026-04-17 20:16:03 +0000 UTC" firstStartedPulling="2026-04-17 20:18:19.014257996 +0000 UTC m=+168.679183664" lastFinishedPulling="2026-04-17 20:18:20.647043953 +0000 UTC m=+170.311969635" observedRunningTime="2026-04-17 20:18:21.51671553 +0000 UTC m=+171.181641221" watchObservedRunningTime="2026-04-17 20:18:21.518644349 +0000 UTC m=+171.183570040" Apr 17 20:18:23.510523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:23.510484 2579 generic.go:358] "Generic (PLEG): container finished" podID="1e417380-f1cf-4d7a-b044-4fb0022ce22c" containerID="724cc691089f31918a1a6bd47509e71718197e749ffaf086547700072aace2f5" exitCode=0 Apr 17 20:18:23.510948 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:23.510535 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp" event={"ID":"1e417380-f1cf-4d7a-b044-4fb0022ce22c","Type":"ContainerDied","Data":"724cc691089f31918a1a6bd47509e71718197e749ffaf086547700072aace2f5"} Apr 17 20:18:23.510948 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:23.510940 2579 scope.go:117] "RemoveContainer" containerID="724cc691089f31918a1a6bd47509e71718197e749ffaf086547700072aace2f5" Apr 17 20:18:24.480501 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:24.480472 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vkvbf" Apr 17 20:18:24.515336 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:24.515306 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tpfmp" event={"ID":"1e417380-f1cf-4d7a-b044-4fb0022ce22c","Type":"ContainerStarted","Data":"79628e2c08eea2a5df757e657111f4e75a5fd33fd39dea14c50846d02351e17e"} Apr 17 20:18:33.520332 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.520266 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c97464997-lpw7s" podUID="16642ce0-f944-451f-8118-544de2e0acda" containerName="console" containerID="cri-o://af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12" gracePeriod=15 Apr 17 20:18:33.761210 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.761182 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c97464997-lpw7s_16642ce0-f944-451f-8118-544de2e0acda/console/0.log" Apr 17 20:18:33.761330 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.761247 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:18:33.878703 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.878670 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-oauth-serving-cert\") pod \"16642ce0-f944-451f-8118-544de2e0acda\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " Apr 17 20:18:33.878896 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.878726 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-service-ca\") pod \"16642ce0-f944-451f-8118-544de2e0acda\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " Apr 17 20:18:33.878896 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.878781 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16642ce0-f944-451f-8118-544de2e0acda-console-oauth-config\") pod \"16642ce0-f944-451f-8118-544de2e0acda\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " Apr 17 20:18:33.878896 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.878812 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-trusted-ca-bundle\") pod \"16642ce0-f944-451f-8118-544de2e0acda\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " Apr 17 20:18:33.878896 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.878864 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzdfh\" (UniqueName: \"kubernetes.io/projected/16642ce0-f944-451f-8118-544de2e0acda-kube-api-access-tzdfh\") pod \"16642ce0-f944-451f-8118-544de2e0acda\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " Apr 17 20:18:33.879100 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:18:33.879017 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16642ce0-f944-451f-8118-544de2e0acda-console-serving-cert\") pod \"16642ce0-f944-451f-8118-544de2e0acda\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " Apr 17 20:18:33.879100 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.879065 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-console-config\") pod \"16642ce0-f944-451f-8118-544de2e0acda\" (UID: \"16642ce0-f944-451f-8118-544de2e0acda\") " Apr 17 20:18:33.879236 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.879123 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "16642ce0-f944-451f-8118-544de2e0acda" (UID: "16642ce0-f944-451f-8118-544de2e0acda"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:18:33.879317 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.879250 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-service-ca" (OuterVolumeSpecName: "service-ca") pod "16642ce0-f944-451f-8118-544de2e0acda" (UID: "16642ce0-f944-451f-8118-544de2e0acda"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:18:33.879372 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.879322 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "16642ce0-f944-451f-8118-544de2e0acda" (UID: "16642ce0-f944-451f-8118-544de2e0acda"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:18:33.879476 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.879456 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-oauth-serving-cert\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:18:33.879546 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.879482 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-service-ca\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:18:33.879546 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.879496 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-console-config" (OuterVolumeSpecName: "console-config") pod "16642ce0-f944-451f-8118-544de2e0acda" (UID: "16642ce0-f944-451f-8118-544de2e0acda"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:18:33.881308 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.881283 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16642ce0-f944-451f-8118-544de2e0acda-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "16642ce0-f944-451f-8118-544de2e0acda" (UID: "16642ce0-f944-451f-8118-544de2e0acda"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:18:33.881308 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.881291 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16642ce0-f944-451f-8118-544de2e0acda-kube-api-access-tzdfh" (OuterVolumeSpecName: "kube-api-access-tzdfh") pod "16642ce0-f944-451f-8118-544de2e0acda" (UID: "16642ce0-f944-451f-8118-544de2e0acda"). InnerVolumeSpecName "kube-api-access-tzdfh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:18:33.881399 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.881367 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16642ce0-f944-451f-8118-544de2e0acda-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "16642ce0-f944-451f-8118-544de2e0acda" (UID: "16642ce0-f944-451f-8118-544de2e0acda"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:18:33.981007 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.980965 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16642ce0-f944-451f-8118-544de2e0acda-console-oauth-config\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:18:33.981007 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.981001 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-trusted-ca-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:18:33.981007 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.981014 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tzdfh\" (UniqueName: \"kubernetes.io/projected/16642ce0-f944-451f-8118-544de2e0acda-kube-api-access-tzdfh\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath 
\"\"" Apr 17 20:18:33.981245 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.981030 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16642ce0-f944-451f-8118-544de2e0acda-console-serving-cert\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:18:33.981245 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:33.981043 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16642ce0-f944-451f-8118-544de2e0acda-console-config\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:18:34.547733 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:34.547700 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c97464997-lpw7s_16642ce0-f944-451f-8118-544de2e0acda/console/0.log" Apr 17 20:18:34.548152 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:34.547763 2579 generic.go:358] "Generic (PLEG): container finished" podID="16642ce0-f944-451f-8118-544de2e0acda" containerID="af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12" exitCode=2 Apr 17 20:18:34.548152 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:34.547831 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c97464997-lpw7s" Apr 17 20:18:34.548152 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:34.547833 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c97464997-lpw7s" event={"ID":"16642ce0-f944-451f-8118-544de2e0acda","Type":"ContainerDied","Data":"af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12"} Apr 17 20:18:34.548152 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:34.547943 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c97464997-lpw7s" event={"ID":"16642ce0-f944-451f-8118-544de2e0acda","Type":"ContainerDied","Data":"ba41afb63232a2527f632c122afdd07f7677c93bfa19ab0f3d84b4ff43b80f83"} Apr 17 20:18:34.548152 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:34.547959 2579 scope.go:117] "RemoveContainer" containerID="af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12" Apr 17 20:18:34.556608 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:34.556586 2579 scope.go:117] "RemoveContainer" containerID="af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12" Apr 17 20:18:34.556965 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:18:34.556945 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12\": container with ID starting with af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12 not found: ID does not exist" containerID="af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12" Apr 17 20:18:34.557031 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:34.556974 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12"} err="failed to get container status \"af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12\": rpc error: code = NotFound desc 
= could not find container \"af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12\": container with ID starting with af6d3a00cd59d8c782f57c3e195c151b0b34a32412afc927a920724994d92c12 not found: ID does not exist" Apr 17 20:18:34.568539 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:34.568506 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c97464997-lpw7s"] Apr 17 20:18:34.569968 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:34.569941 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c97464997-lpw7s"] Apr 17 20:18:34.877233 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:34.877208 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16642ce0-f944-451f-8118-544de2e0acda" path="/var/lib/kubelet/pods/16642ce0-f944-451f-8118-544de2e0acda/volumes" Apr 17 20:18:35.717253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:35.717213 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" podUID="0c1607fb-6a2c-4213-8de7-34c392a4fd1c" containerName="registry" containerID="cri-o://3a559f1e3d7fe69bfd4ecb5ddc9730e68595927153b26509913e328171a5c487" gracePeriod=30 Apr 17 20:18:37.558982 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:37.558944 2579 generic.go:358] "Generic (PLEG): container finished" podID="0c1607fb-6a2c-4213-8de7-34c392a4fd1c" containerID="3a559f1e3d7fe69bfd4ecb5ddc9730e68595927153b26509913e328171a5c487" exitCode=0 Apr 17 20:18:37.559349 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:37.558989 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" event={"ID":"0c1607fb-6a2c-4213-8de7-34c392a4fd1c","Type":"ContainerDied","Data":"3a559f1e3d7fe69bfd4ecb5ddc9730e68595927153b26509913e328171a5c487"} Apr 17 20:18:37.559349 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:37.559017 2579 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" event={"ID":"0c1607fb-6a2c-4213-8de7-34c392a4fd1c","Type":"ContainerStarted","Data":"61bd3b57fbb428a7ab542140836c255727befa68899e378397714dc5e7581317"} Apr 17 20:18:37.559349 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:37.559120 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:18:38.563381 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:38.563344 2579 generic.go:358] "Generic (PLEG): container finished" podID="ee130621-350a-49cf-905b-3a5917dcd327" containerID="61fd8c685a75ef7404495a245576d0f8b7d70f957031376b4c1ade8b7e033c97" exitCode=0 Apr 17 20:18:38.563767 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:38.563423 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gw79k" event={"ID":"ee130621-350a-49cf-905b-3a5917dcd327","Type":"ContainerDied","Data":"61fd8c685a75ef7404495a245576d0f8b7d70f957031376b4c1ade8b7e033c97"} Apr 17 20:18:38.563924 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:38.563907 2579 scope.go:117] "RemoveContainer" containerID="61fd8c685a75ef7404495a245576d0f8b7d70f957031376b4c1ade8b7e033c97" Apr 17 20:18:39.206368 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:39.206331 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6c8d5fbf5c-sfqff_0c1607fb-6a2c-4213-8de7-34c392a4fd1c/registry/0.log" Apr 17 20:18:39.211615 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:39.211590 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6c8d5fbf5c-sfqff_0c1607fb-6a2c-4213-8de7-34c392a4fd1c/registry/1.log" Apr 17 20:18:39.222589 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:39.222567 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-msb4t_963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18/node-ca/0.log" Apr 17 20:18:39.567538 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:39.567454 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gw79k" event={"ID":"ee130621-350a-49cf-905b-3a5917dcd327","Type":"ContainerStarted","Data":"02f4956d39c9d648ea7bb4e64c0e58d29843084cab90f112548ab4d8e15e30ff"} Apr 17 20:18:58.567352 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:18:58.567319 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6c8d5fbf5c-sfqff" Apr 17 20:19:12.460254 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.460217 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-bcb7f564c-6w4fw"] Apr 17 20:19:12.460618 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.460560 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16642ce0-f944-451f-8118-544de2e0acda" containerName="console" Apr 17 20:19:12.460618 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.460576 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="16642ce0-f944-451f-8118-544de2e0acda" containerName="console" Apr 17 20:19:12.460728 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.460646 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="16642ce0-f944-451f-8118-544de2e0acda" containerName="console" Apr 17 20:19:12.463805 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.463780 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.474884 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.474858 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bcb7f564c-6w4fw"] Apr 17 20:19:12.508651 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.508603 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09875294-d8e4-4034-ae11-1838d7158d64-console-oauth-config\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.508896 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.508671 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-service-ca\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.508896 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.508703 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-oauth-serving-cert\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.508896 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.508760 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw4wp\" (UniqueName: \"kubernetes.io/projected/09875294-d8e4-4034-ae11-1838d7158d64-kube-api-access-cw4wp\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.508896 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.508797 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09875294-d8e4-4034-ae11-1838d7158d64-console-serving-cert\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.508896 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.508821 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-trusted-ca-bundle\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.509184 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.508902 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-console-config\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.610039 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.610006 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09875294-d8e4-4034-ae11-1838d7158d64-console-oauth-config\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.610199 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.610048 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-service-ca\") pod \"console-bcb7f564c-6w4fw\" (UID: 
\"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.610199 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.610066 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-oauth-serving-cert\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.610199 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.610089 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw4wp\" (UniqueName: \"kubernetes.io/projected/09875294-d8e4-4034-ae11-1838d7158d64-kube-api-access-cw4wp\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.610199 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.610109 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09875294-d8e4-4034-ae11-1838d7158d64-console-serving-cert\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.610199 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.610125 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-trusted-ca-bundle\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.610199 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.610150 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-console-config\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.610794 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.610769 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-console-config\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.610901 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.610777 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-service-ca\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.610901 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.610869 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-oauth-serving-cert\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.611659 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.611638 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-trusted-ca-bundle\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.613163 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.613144 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09875294-d8e4-4034-ae11-1838d7158d64-console-serving-cert\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.613256 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.613183 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09875294-d8e4-4034-ae11-1838d7158d64-console-oauth-config\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.617128 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.617108 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw4wp\" (UniqueName: \"kubernetes.io/projected/09875294-d8e4-4034-ae11-1838d7158d64-kube-api-access-cw4wp\") pod \"console-bcb7f564c-6w4fw\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.774721 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.774622 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:12.901974 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:12.901950 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bcb7f564c-6w4fw"] Apr 17 20:19:12.904799 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:19:12.904776 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09875294_d8e4_4034_ae11_1838d7158d64.slice/crio-6a5ac9b3655b738bed106a5fbf3eef6dbdba632ea2d658f4f51ace36d878f790 WatchSource:0}: Error finding container 6a5ac9b3655b738bed106a5fbf3eef6dbdba632ea2d658f4f51ace36d878f790: Status 404 returned error can't find the container with id 6a5ac9b3655b738bed106a5fbf3eef6dbdba632ea2d658f4f51ace36d878f790 Apr 17 20:19:13.673712 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:13.673674 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bcb7f564c-6w4fw" event={"ID":"09875294-d8e4-4034-ae11-1838d7158d64","Type":"ContainerStarted","Data":"81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f"} Apr 17 20:19:13.673712 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:13.673713 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bcb7f564c-6w4fw" event={"ID":"09875294-d8e4-4034-ae11-1838d7158d64","Type":"ContainerStarted","Data":"6a5ac9b3655b738bed106a5fbf3eef6dbdba632ea2d658f4f51ace36d878f790"} Apr 17 20:19:13.689856 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:13.689811 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bcb7f564c-6w4fw" podStartSLOduration=1.689798616 podStartE2EDuration="1.689798616s" podCreationTimestamp="2026-04-17 20:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:19:13.688713267 +0000 UTC m=+223.353638957" 
watchObservedRunningTime="2026-04-17 20:19:13.689798616 +0000 UTC m=+223.354724306" Apr 17 20:19:22.775043 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:22.775002 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:22.775043 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:22.775047 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:22.779871 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:22.779841 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:23.708890 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:23.708853 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:19:23.752424 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:23.752387 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56d57c576f-6k4pn"] Apr 17 20:19:48.779949 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:48.779863 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56d57c576f-6k4pn" podUID="c26a743e-8bd2-4496-be1d-a5a24a32f42a" containerName="console" containerID="cri-o://118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e" gracePeriod=15 Apr 17 20:19:49.022793 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.022719 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56d57c576f-6k4pn_c26a743e-8bd2-4496-be1d-a5a24a32f42a/console/0.log" Apr 17 20:19:49.022926 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.022837 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:19:49.104418 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.104334 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g"] Apr 17 20:19:49.104687 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.104669 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c26a743e-8bd2-4496-be1d-a5a24a32f42a" containerName="console" Apr 17 20:19:49.104771 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.104689 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c26a743e-8bd2-4496-be1d-a5a24a32f42a" containerName="console" Apr 17 20:19:49.104771 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.104761 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c26a743e-8bd2-4496-be1d-a5a24a32f42a" containerName="console" Apr 17 20:19:49.108032 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.108011 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:19:49.110400 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.110365 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:19:49.110400 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.110365 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rd8dn\"" Apr 17 20:19:49.110569 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.110367 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:19:49.114166 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.114129 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-serving-cert\") pod \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " Apr 17 20:19:49.114290 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.114223 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-trusted-ca-bundle\") pod \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " Apr 17 20:19:49.114349 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.114312 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-config\") pod \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " Apr 17 20:19:49.114349 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.114341 2579 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-oauth-config\") pod \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " Apr 17 20:19:49.114454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.114414 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cht5z\" (UniqueName: \"kubernetes.io/projected/c26a743e-8bd2-4496-be1d-a5a24a32f42a-kube-api-access-cht5z\") pod \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " Apr 17 20:19:49.114506 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.114469 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-oauth-serving-cert\") pod \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " Apr 17 20:19:49.114557 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.114503 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-service-ca\") pod \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\" (UID: \"c26a743e-8bd2-4496-be1d-a5a24a32f42a\") " Apr 17 20:19:49.116310 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.114698 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmlng\" (UniqueName: \"kubernetes.io/projected/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-kube-api-access-tmlng\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g\" (UID: \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:19:49.116310 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.114774 
2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g\" (UID: \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:19:49.116310 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.114937 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g\" (UID: \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:19:49.116310 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.115356 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c26a743e-8bd2-4496-be1d-a5a24a32f42a" (UID: "c26a743e-8bd2-4496-be1d-a5a24a32f42a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:19:49.116310 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.115502 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-service-ca" (OuterVolumeSpecName: "service-ca") pod "c26a743e-8bd2-4496-be1d-a5a24a32f42a" (UID: "c26a743e-8bd2-4496-be1d-a5a24a32f42a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:19:49.116310 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.115786 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g"] Apr 17 20:19:49.116310 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.116070 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c26a743e-8bd2-4496-be1d-a5a24a32f42a" (UID: "c26a743e-8bd2-4496-be1d-a5a24a32f42a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:19:49.116310 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.116093 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-config" (OuterVolumeSpecName: "console-config") pod "c26a743e-8bd2-4496-be1d-a5a24a32f42a" (UID: "c26a743e-8bd2-4496-be1d-a5a24a32f42a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:19:49.117134 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.117106 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c26a743e-8bd2-4496-be1d-a5a24a32f42a" (UID: "c26a743e-8bd2-4496-be1d-a5a24a32f42a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:19:49.117728 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.117697 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c26a743e-8bd2-4496-be1d-a5a24a32f42a" (UID: "c26a743e-8bd2-4496-be1d-a5a24a32f42a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:19:49.118326 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.118298 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c26a743e-8bd2-4496-be1d-a5a24a32f42a-kube-api-access-cht5z" (OuterVolumeSpecName: "kube-api-access-cht5z") pod "c26a743e-8bd2-4496-be1d-a5a24a32f42a" (UID: "c26a743e-8bd2-4496-be1d-a5a24a32f42a"). InnerVolumeSpecName "kube-api-access-cht5z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:19:49.215679 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.215637 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmlng\" (UniqueName: \"kubernetes.io/projected/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-kube-api-access-tmlng\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g\" (UID: \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:19:49.215679 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.215682 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g\" (UID: \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:19:49.215905 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.215725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g\" (UID: \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:19:49.215905 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.215797 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-serving-cert\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:19:49.215905 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.215808 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-trusted-ca-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:19:49.215905 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.215830 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-config\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:19:49.215905 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.215839 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c26a743e-8bd2-4496-be1d-a5a24a32f42a-console-oauth-config\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:19:49.215905 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.215849 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cht5z\" (UniqueName: \"kubernetes.io/projected/c26a743e-8bd2-4496-be1d-a5a24a32f42a-kube-api-access-cht5z\") on node 
\"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:19:49.215905 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.215858 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-oauth-serving-cert\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:19:49.215905 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.215866 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c26a743e-8bd2-4496-be1d-a5a24a32f42a-service-ca\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:19:49.216163 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.216143 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g\" (UID: \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:19:49.216199 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.216157 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g\" (UID: \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:19:49.223102 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.223080 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmlng\" (UniqueName: \"kubernetes.io/projected/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-kube-api-access-tmlng\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g\" (UID: \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:19:49.419200 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.419159 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:19:49.539894 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.539866 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g"] Apr 17 20:19:49.541992 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:19:49.541962 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda22ca5ce_bbf2_4b55_b2b6_4a71e6d22ddf.slice/crio-405e8fd0f30462d2e46685701a1d63c26672dde625a629c2253124b9f988585f WatchSource:0}: Error finding container 405e8fd0f30462d2e46685701a1d63c26672dde625a629c2253124b9f988585f: Status 404 returned error can't find the container with id 405e8fd0f30462d2e46685701a1d63c26672dde625a629c2253124b9f988585f Apr 17 20:19:49.781072 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.780992 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56d57c576f-6k4pn_c26a743e-8bd2-4496-be1d-a5a24a32f42a/console/0.log" Apr 17 20:19:49.781072 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.781034 2579 generic.go:358] "Generic (PLEG): container finished" podID="c26a743e-8bd2-4496-be1d-a5a24a32f42a" containerID="118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e" exitCode=2 Apr 17 20:19:49.781562 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.781091 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d57c576f-6k4pn" event={"ID":"c26a743e-8bd2-4496-be1d-a5a24a32f42a","Type":"ContainerDied","Data":"118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e"} Apr 17 20:19:49.781562 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.781102 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56d57c576f-6k4pn" Apr 17 20:19:49.781562 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.781117 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d57c576f-6k4pn" event={"ID":"c26a743e-8bd2-4496-be1d-a5a24a32f42a","Type":"ContainerDied","Data":"1ce180312e456604ac8f12063a093b3e5526688c153ac78e419a1d29aee31add"} Apr 17 20:19:49.781562 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.781132 2579 scope.go:117] "RemoveContainer" containerID="118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e" Apr 17 20:19:49.782312 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.782291 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" event={"ID":"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf","Type":"ContainerStarted","Data":"405e8fd0f30462d2e46685701a1d63c26672dde625a629c2253124b9f988585f"} Apr 17 20:19:49.789438 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.789416 2579 scope.go:117] "RemoveContainer" containerID="118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e" Apr 17 20:19:49.789689 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:19:49.789671 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e\": container with ID starting with 118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e not found: ID does not exist" containerID="118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e" Apr 17 20:19:49.789776 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.789696 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e"} err="failed to get container status \"118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e\": rpc error: code = NotFound desc = could not find container \"118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e\": container with ID starting with 118681d9dc0c6cab76d8091e6e640167194305bb121c117922bbb544153a3a4e not found: ID does not exist" Apr 17 20:19:49.807984 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.807959 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56d57c576f-6k4pn"] Apr 17 20:19:49.811109 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:49.811086 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56d57c576f-6k4pn"] Apr 17 20:19:50.878087 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:50.878055 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c26a743e-8bd2-4496-be1d-a5a24a32f42a" path="/var/lib/kubelet/pods/c26a743e-8bd2-4496-be1d-a5a24a32f42a/volumes" Apr 17 20:19:57.810776 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:57.810723 2579 generic.go:358] "Generic (PLEG): container finished" podID="a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" containerID="1f2ce07a4362c750a805534f6fc67d5557b10575d1cbedc342be6eb91b8233e1" exitCode=0 Apr 17 20:19:57.811167 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:19:57.810817 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" event={"ID":"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf","Type":"ContainerDied","Data":"1f2ce07a4362c750a805534f6fc67d5557b10575d1cbedc342be6eb91b8233e1"} Apr 17 20:20:01.823239 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:01.823197 2579 generic.go:358] "Generic (PLEG): container finished" podID="a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" 
containerID="5abc8cf2752f4fda3b09f19af2bf691b9f4f9366cc8915b31d7deb5519f5f0f8" exitCode=0 Apr 17 20:20:01.823715 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:01.823258 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" event={"ID":"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf","Type":"ContainerDied","Data":"5abc8cf2752f4fda3b09f19af2bf691b9f4f9366cc8915b31d7deb5519f5f0f8"} Apr 17 20:20:10.851390 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:10.851345 2579 generic.go:358] "Generic (PLEG): container finished" podID="a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" containerID="89ef39bc1232ecd038867d6b143c33971354ff02fe45c533dc3594a5d0fca393" exitCode=0 Apr 17 20:20:10.851868 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:10.851401 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" event={"ID":"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf","Type":"ContainerDied","Data":"89ef39bc1232ecd038867d6b143c33971354ff02fe45c533dc3594a5d0fca393"} Apr 17 20:20:11.990319 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:11.990288 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:20:12.120620 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:12.120525 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-bundle\") pod \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\" (UID: \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\") " Apr 17 20:20:12.120620 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:12.120593 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmlng\" (UniqueName: \"kubernetes.io/projected/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-kube-api-access-tmlng\") pod \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\" (UID: \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\") " Apr 17 20:20:12.120904 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:12.120629 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-util\") pod \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\" (UID: \"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf\") " Apr 17 20:20:12.121233 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:12.121201 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-bundle" (OuterVolumeSpecName: "bundle") pod "a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" (UID: "a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:20:12.123078 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:12.123045 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-kube-api-access-tmlng" (OuterVolumeSpecName: "kube-api-access-tmlng") pod "a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" (UID: "a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf"). InnerVolumeSpecName "kube-api-access-tmlng". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:20:12.125427 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:12.125399 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-util" (OuterVolumeSpecName: "util") pod "a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" (UID: "a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:20:12.221526 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:12.221486 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:20:12.221526 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:12.221521 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tmlng\" (UniqueName: \"kubernetes.io/projected/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-kube-api-access-tmlng\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:20:12.221526 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:12.221532 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf-util\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:20:12.858887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:12.858852 2579 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" event={"ID":"a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf","Type":"ContainerDied","Data":"405e8fd0f30462d2e46685701a1d63c26672dde625a629c2253124b9f988585f"} Apr 17 20:20:12.858887 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:12.858891 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="405e8fd0f30462d2e46685701a1d63c26672dde625a629c2253124b9f988585f" Apr 17 20:20:12.859104 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:12.858866 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qvs9g" Apr 17 20:20:16.204226 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.204176 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v"] Apr 17 20:20:16.206342 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.206314 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" containerName="util" Apr 17 20:20:16.206500 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.206339 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" containerName="util" Apr 17 20:20:16.206500 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.206467 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" containerName="pull" Apr 17 20:20:16.206500 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.206473 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" containerName="pull" Apr 17 20:20:16.206500 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.206482 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" containerName="extract" Apr 17 20:20:16.206500 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.206488 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" containerName="extract" Apr 17 20:20:16.206733 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.206565 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a22ca5ce-bbf2-4b55-b2b6-4a71e6d22ddf" containerName="extract" Apr 17 20:20:16.211382 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.211359 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v" Apr 17 20:20:16.217217 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.217181 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 17 20:20:16.217406 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.217381 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-nhbmm\"" Apr 17 20:20:16.217486 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.217428 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:20:16.219232 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.219207 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v"] Apr 17 20:20:16.358668 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.358631 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ecc54ca-17e3-47c8-9d01-ea961a018cf9-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-w8n8v\" (UID: 
\"9ecc54ca-17e3-47c8-9d01-ea961a018cf9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v" Apr 17 20:20:16.358668 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.358673 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc6ns\" (UniqueName: \"kubernetes.io/projected/9ecc54ca-17e3-47c8-9d01-ea961a018cf9-kube-api-access-xc6ns\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-w8n8v\" (UID: \"9ecc54ca-17e3-47c8-9d01-ea961a018cf9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v" Apr 17 20:20:16.459242 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.459149 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ecc54ca-17e3-47c8-9d01-ea961a018cf9-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-w8n8v\" (UID: \"9ecc54ca-17e3-47c8-9d01-ea961a018cf9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v" Apr 17 20:20:16.459242 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.459191 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc6ns\" (UniqueName: \"kubernetes.io/projected/9ecc54ca-17e3-47c8-9d01-ea961a018cf9-kube-api-access-xc6ns\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-w8n8v\" (UID: \"9ecc54ca-17e3-47c8-9d01-ea961a018cf9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v" Apr 17 20:20:16.459553 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.459535 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ecc54ca-17e3-47c8-9d01-ea961a018cf9-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-w8n8v\" (UID: \"9ecc54ca-17e3-47c8-9d01-ea961a018cf9\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v" Apr 17 20:20:16.470346 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.470319 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc6ns\" (UniqueName: \"kubernetes.io/projected/9ecc54ca-17e3-47c8-9d01-ea961a018cf9-kube-api-access-xc6ns\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-w8n8v\" (UID: \"9ecc54ca-17e3-47c8-9d01-ea961a018cf9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v" Apr 17 20:20:16.525367 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.525328 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v" Apr 17 20:20:16.657341 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.657304 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v"] Apr 17 20:20:16.660177 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:20:16.660151 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ecc54ca_17e3_47c8_9d01_ea961a018cf9.slice/crio-d6d57eef0a3f27bbce527dc4d34a9aa5723c9c450bb2539f2e81f2f1e7d9550e WatchSource:0}: Error finding container d6d57eef0a3f27bbce527dc4d34a9aa5723c9c450bb2539f2e81f2f1e7d9550e: Status 404 returned error can't find the container with id d6d57eef0a3f27bbce527dc4d34a9aa5723c9c450bb2539f2e81f2f1e7d9550e Apr 17 20:20:16.872458 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:16.872367 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v" event={"ID":"9ecc54ca-17e3-47c8-9d01-ea961a018cf9","Type":"ContainerStarted","Data":"d6d57eef0a3f27bbce527dc4d34a9aa5723c9c450bb2539f2e81f2f1e7d9550e"} Apr 17 20:20:27.907099 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:20:27.907058 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v" event={"ID":"9ecc54ca-17e3-47c8-9d01-ea961a018cf9","Type":"ContainerStarted","Data":"f7a5863e7979e7a425314d004ab8cb1a803ce2b53a7ce318e9ac52065da29db3"} Apr 17 20:20:27.924812 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:27.924717 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-w8n8v" podStartSLOduration=0.76092902 podStartE2EDuration="11.924698179s" podCreationTimestamp="2026-04-17 20:20:16 +0000 UTC" firstStartedPulling="2026-04-17 20:20:16.662611755 +0000 UTC m=+286.327537427" lastFinishedPulling="2026-04-17 20:20:27.826380913 +0000 UTC m=+297.491306586" observedRunningTime="2026-04-17 20:20:27.92298081 +0000 UTC m=+297.587906510" watchObservedRunningTime="2026-04-17 20:20:27.924698179 +0000 UTC m=+297.589623881" Apr 17 20:20:29.299722 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.299680 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4"] Apr 17 20:20:29.303297 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.303279 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 20:20:29.305474 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.305442 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:20:29.305591 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.305442 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:20:29.306360 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.306329 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rd8dn\"" Apr 17 20:20:29.309630 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.309605 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4"] Apr 17 20:20:29.473936 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.473900 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4\" (UID: \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 20:20:29.474120 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.473956 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt46s\" (UniqueName: \"kubernetes.io/projected/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-kube-api-access-jt46s\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4\" (UID: \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 
20:20:29.474120 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.474000 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4\" (UID: \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 20:20:29.575019 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.574919 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4\" (UID: \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 20:20:29.575019 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.574976 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jt46s\" (UniqueName: \"kubernetes.io/projected/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-kube-api-access-jt46s\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4\" (UID: \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 20:20:29.575019 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.575011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4\" (UID: \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 20:20:29.575341 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:20:29.575318 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4\" (UID: \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 20:20:29.575405 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.575386 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4\" (UID: \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 20:20:29.582891 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.582859 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt46s\" (UniqueName: \"kubernetes.io/projected/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-kube-api-access-jt46s\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4\" (UID: \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 20:20:29.613870 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.613830 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 20:20:29.738676 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.738650 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4"] Apr 17 20:20:29.741425 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:20:29.741399 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8cbf476_6fde_42ef_9d8f_4b10e51f9998.slice/crio-490f26efe5c1b0f551e09844a90d1ecbffd08d94cb1bc4b1b46888c8ab9eaaac WatchSource:0}: Error finding container 490f26efe5c1b0f551e09844a90d1ecbffd08d94cb1bc4b1b46888c8ab9eaaac: Status 404 returned error can't find the container with id 490f26efe5c1b0f551e09844a90d1ecbffd08d94cb1bc4b1b46888c8ab9eaaac Apr 17 20:20:29.915658 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.915621 2579 generic.go:358] "Generic (PLEG): container finished" podID="e8cbf476-6fde-42ef-9d8f-4b10e51f9998" containerID="0c988cfff4a24645be751c3171fdc9fc0c633ff179196ab6399d015afb89cbb8" exitCode=0 Apr 17 20:20:29.915824 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.915673 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" event={"ID":"e8cbf476-6fde-42ef-9d8f-4b10e51f9998","Type":"ContainerDied","Data":"0c988cfff4a24645be751c3171fdc9fc0c633ff179196ab6399d015afb89cbb8"} Apr 17 20:20:29.915824 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:29.915696 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" event={"ID":"e8cbf476-6fde-42ef-9d8f-4b10e51f9998","Type":"ContainerStarted","Data":"490f26efe5c1b0f551e09844a90d1ecbffd08d94cb1bc4b1b46888c8ab9eaaac"} Apr 17 20:20:30.763003 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:20:30.762971 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/1.log" Apr 17 20:20:30.763469 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:30.763206 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/1.log" Apr 17 20:20:30.780625 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:30.780599 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 20:20:31.389628 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.389589 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-s2kr6"] Apr 17 20:20:31.392590 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.392575 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" Apr 17 20:20:31.394732 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.394709 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-f2glp\"" Apr 17 20:20:31.394732 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.394721 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 20:20:31.395481 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.395462 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 20:20:31.402663 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.402636 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-s2kr6"] Apr 17 20:20:31.489041 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.489000 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2abd673-5c75-42ea-8bfe-135a2eb25f26-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-s2kr6\" (UID: \"c2abd673-5c75-42ea-8bfe-135a2eb25f26\") " pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" Apr 17 20:20:31.489212 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.489136 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kszfr\" (UniqueName: \"kubernetes.io/projected/c2abd673-5c75-42ea-8bfe-135a2eb25f26-kube-api-access-kszfr\") pod \"cert-manager-webhook-597b96b99b-s2kr6\" (UID: \"c2abd673-5c75-42ea-8bfe-135a2eb25f26\") " pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" Apr 17 20:20:31.590619 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.590575 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kszfr\" (UniqueName: \"kubernetes.io/projected/c2abd673-5c75-42ea-8bfe-135a2eb25f26-kube-api-access-kszfr\") pod \"cert-manager-webhook-597b96b99b-s2kr6\" (UID: \"c2abd673-5c75-42ea-8bfe-135a2eb25f26\") " pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" Apr 17 20:20:31.590619 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.590627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2abd673-5c75-42ea-8bfe-135a2eb25f26-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-s2kr6\" (UID: \"c2abd673-5c75-42ea-8bfe-135a2eb25f26\") " pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" Apr 17 20:20:31.598082 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.598042 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2abd673-5c75-42ea-8bfe-135a2eb25f26-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-s2kr6\" (UID: \"c2abd673-5c75-42ea-8bfe-135a2eb25f26\") " 
pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" Apr 17 20:20:31.598253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.598231 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kszfr\" (UniqueName: \"kubernetes.io/projected/c2abd673-5c75-42ea-8bfe-135a2eb25f26-kube-api-access-kszfr\") pod \"cert-manager-webhook-597b96b99b-s2kr6\" (UID: \"c2abd673-5c75-42ea-8bfe-135a2eb25f26\") " pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" Apr 17 20:20:31.715919 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.715826 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" Apr 17 20:20:31.839942 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.839900 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-s2kr6"] Apr 17 20:20:31.843609 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:20:31.843579 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2abd673_5c75_42ea_8bfe_135a2eb25f26.slice/crio-e3a31b687d64efbc093f340a6a889068b15ec6d815c55dacd49fd962315471d4 WatchSource:0}: Error finding container e3a31b687d64efbc093f340a6a889068b15ec6d815c55dacd49fd962315471d4: Status 404 returned error can't find the container with id e3a31b687d64efbc093f340a6a889068b15ec6d815c55dacd49fd962315471d4 Apr 17 20:20:31.845410 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.845390 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:20:31.922564 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:31.922527 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" event={"ID":"c2abd673-5c75-42ea-8bfe-135a2eb25f26","Type":"ContainerStarted","Data":"e3a31b687d64efbc093f340a6a889068b15ec6d815c55dacd49fd962315471d4"} Apr 17 
20:20:34.936004 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:34.935958 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" event={"ID":"c2abd673-5c75-42ea-8bfe-135a2eb25f26","Type":"ContainerStarted","Data":"52657f829f6dc937395ee14e048e1a8a3c889501abcd40632416c9fcebf81acc"} Apr 17 20:20:34.936493 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:34.936041 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" Apr 17 20:20:34.952232 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:34.952139 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" podStartSLOduration=1.100508807 podStartE2EDuration="3.952124517s" podCreationTimestamp="2026-04-17 20:20:31 +0000 UTC" firstStartedPulling="2026-04-17 20:20:31.845551593 +0000 UTC m=+301.510477261" lastFinishedPulling="2026-04-17 20:20:34.697167296 +0000 UTC m=+304.362092971" observedRunningTime="2026-04-17 20:20:34.949801661 +0000 UTC m=+304.614727351" watchObservedRunningTime="2026-04-17 20:20:34.952124517 +0000 UTC m=+304.617050204" Apr 17 20:20:40.941896 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:40.941860 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-s2kr6" Apr 17 20:20:45.979686 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:45.979649 2579 generic.go:358] "Generic (PLEG): container finished" podID="e8cbf476-6fde-42ef-9d8f-4b10e51f9998" containerID="d75cd246b42267eeb6b6b3eb5c6c6b5e5ec1253b5a0c9d15c3aad4ad2db1b542" exitCode=0 Apr 17 20:20:45.980080 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:45.979728 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" 
event={"ID":"e8cbf476-6fde-42ef-9d8f-4b10e51f9998","Type":"ContainerDied","Data":"d75cd246b42267eeb6b6b3eb5c6c6b5e5ec1253b5a0c9d15c3aad4ad2db1b542"} Apr 17 20:20:46.990207 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:46.990172 2579 generic.go:358] "Generic (PLEG): container finished" podID="e8cbf476-6fde-42ef-9d8f-4b10e51f9998" containerID="310a915353397b66a09526ff5e7f0da263d3247ff8cc308ce2bc4d912c97af47" exitCode=0 Apr 17 20:20:46.990603 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:46.990261 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" event={"ID":"e8cbf476-6fde-42ef-9d8f-4b10e51f9998","Type":"ContainerDied","Data":"310a915353397b66a09526ff5e7f0da263d3247ff8cc308ce2bc4d912c97af47"} Apr 17 20:20:48.114700 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.114675 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 20:20:48.135163 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.135127 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-bundle\") pod \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\" (UID: \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\") " Apr 17 20:20:48.135342 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.135210 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt46s\" (UniqueName: \"kubernetes.io/projected/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-kube-api-access-jt46s\") pod \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\" (UID: \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\") " Apr 17 20:20:48.135342 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.135247 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-util\") pod \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\" (UID: \"e8cbf476-6fde-42ef-9d8f-4b10e51f9998\") " Apr 17 20:20:48.135629 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.135596 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-bundle" (OuterVolumeSpecName: "bundle") pod "e8cbf476-6fde-42ef-9d8f-4b10e51f9998" (UID: "e8cbf476-6fde-42ef-9d8f-4b10e51f9998"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:20:48.137445 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.137411 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-kube-api-access-jt46s" (OuterVolumeSpecName: "kube-api-access-jt46s") pod "e8cbf476-6fde-42ef-9d8f-4b10e51f9998" (UID: "e8cbf476-6fde-42ef-9d8f-4b10e51f9998"). InnerVolumeSpecName "kube-api-access-jt46s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:20:48.140395 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.140369 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-util" (OuterVolumeSpecName: "util") pod "e8cbf476-6fde-42ef-9d8f-4b10e51f9998" (UID: "e8cbf476-6fde-42ef-9d8f-4b10e51f9998"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:20:48.236914 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.236879 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:20:48.236914 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.236910 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jt46s\" (UniqueName: \"kubernetes.io/projected/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-kube-api-access-jt46s\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:20:48.236914 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.236921 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8cbf476-6fde-42ef-9d8f-4b10e51f9998-util\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:20:48.998369 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.998336 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" Apr 17 20:20:48.998504 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.998335 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzrf4" event={"ID":"e8cbf476-6fde-42ef-9d8f-4b10e51f9998","Type":"ContainerDied","Data":"490f26efe5c1b0f551e09844a90d1ecbffd08d94cb1bc4b1b46888c8ab9eaaac"} Apr 17 20:20:48.998504 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:48.998444 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="490f26efe5c1b0f551e09844a90d1ecbffd08d94cb1bc4b1b46888c8ab9eaaac" Apr 17 20:20:49.602032 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.601994 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-gnvp6"] Apr 17 20:20:49.602392 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.602326 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8cbf476-6fde-42ef-9d8f-4b10e51f9998" containerName="pull" Apr 17 20:20:49.602392 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.602341 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8cbf476-6fde-42ef-9d8f-4b10e51f9998" containerName="pull" Apr 17 20:20:49.602392 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.602354 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8cbf476-6fde-42ef-9d8f-4b10e51f9998" containerName="extract" Apr 17 20:20:49.602392 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.602359 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8cbf476-6fde-42ef-9d8f-4b10e51f9998" containerName="extract" Apr 17 20:20:49.602392 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.602371 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8cbf476-6fde-42ef-9d8f-4b10e51f9998" containerName="util" Apr 17 
20:20:49.602392 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.602376 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8cbf476-6fde-42ef-9d8f-4b10e51f9998" containerName="util" Apr 17 20:20:49.602574 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.602434 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8cbf476-6fde-42ef-9d8f-4b10e51f9998" containerName="extract" Apr 17 20:20:49.605602 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.605582 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-gnvp6" Apr 17 20:20:49.607695 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.607678 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-ccgf5\"" Apr 17 20:20:49.612156 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.612122 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-gnvp6"] Apr 17 20:20:49.649513 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.649471 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qdps\" (UniqueName: \"kubernetes.io/projected/aa7f5e53-6e86-48f7-8a16-9e59035a5078-kube-api-access-9qdps\") pod \"cert-manager-759f64656b-gnvp6\" (UID: \"aa7f5e53-6e86-48f7-8a16-9e59035a5078\") " pod="cert-manager/cert-manager-759f64656b-gnvp6" Apr 17 20:20:49.649675 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.649537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa7f5e53-6e86-48f7-8a16-9e59035a5078-bound-sa-token\") pod \"cert-manager-759f64656b-gnvp6\" (UID: \"aa7f5e53-6e86-48f7-8a16-9e59035a5078\") " pod="cert-manager/cert-manager-759f64656b-gnvp6" Apr 17 20:20:49.750278 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.750234 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9qdps\" (UniqueName: \"kubernetes.io/projected/aa7f5e53-6e86-48f7-8a16-9e59035a5078-kube-api-access-9qdps\") pod \"cert-manager-759f64656b-gnvp6\" (UID: \"aa7f5e53-6e86-48f7-8a16-9e59035a5078\") " pod="cert-manager/cert-manager-759f64656b-gnvp6" Apr 17 20:20:49.750518 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.750295 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa7f5e53-6e86-48f7-8a16-9e59035a5078-bound-sa-token\") pod \"cert-manager-759f64656b-gnvp6\" (UID: \"aa7f5e53-6e86-48f7-8a16-9e59035a5078\") " pod="cert-manager/cert-manager-759f64656b-gnvp6" Apr 17 20:20:49.757966 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.757930 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa7f5e53-6e86-48f7-8a16-9e59035a5078-bound-sa-token\") pod \"cert-manager-759f64656b-gnvp6\" (UID: \"aa7f5e53-6e86-48f7-8a16-9e59035a5078\") " pod="cert-manager/cert-manager-759f64656b-gnvp6" Apr 17 20:20:49.758083 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.758046 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qdps\" (UniqueName: \"kubernetes.io/projected/aa7f5e53-6e86-48f7-8a16-9e59035a5078-kube-api-access-9qdps\") pod \"cert-manager-759f64656b-gnvp6\" (UID: \"aa7f5e53-6e86-48f7-8a16-9e59035a5078\") " pod="cert-manager/cert-manager-759f64656b-gnvp6" Apr 17 20:20:49.916660 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:49.916624 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-gnvp6" Apr 17 20:20:50.037270 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:50.037241 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-gnvp6"] Apr 17 20:20:50.039806 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:20:50.039777 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa7f5e53_6e86_48f7_8a16_9e59035a5078.slice/crio-5126e03bf4502c39c02c02871a3b6b5941fe1098cf316b41531bfd3c4c83948c WatchSource:0}: Error finding container 5126e03bf4502c39c02c02871a3b6b5941fe1098cf316b41531bfd3c4c83948c: Status 404 returned error can't find the container with id 5126e03bf4502c39c02c02871a3b6b5941fe1098cf316b41531bfd3c4c83948c Apr 17 20:20:51.006422 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:51.006371 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-gnvp6" event={"ID":"aa7f5e53-6e86-48f7-8a16-9e59035a5078","Type":"ContainerStarted","Data":"2f924624122c6065dee1d23f1592a12381dfec3918efb26fc27c3850e21f9f09"} Apr 17 20:20:51.006422 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:51.006427 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-gnvp6" event={"ID":"aa7f5e53-6e86-48f7-8a16-9e59035a5078","Type":"ContainerStarted","Data":"5126e03bf4502c39c02c02871a3b6b5941fe1098cf316b41531bfd3c4c83948c"} Apr 17 20:20:51.021172 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:51.021111 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-gnvp6" podStartSLOduration=2.021093715 podStartE2EDuration="2.021093715s" podCreationTimestamp="2026-04-17 20:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:20:51.020688034 +0000 UTC m=+320.685613723" 
watchObservedRunningTime="2026-04-17 20:20:51.021093715 +0000 UTC m=+320.686019405" Apr 17 20:20:59.542980 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.542942 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc"] Apr 17 20:20:59.545442 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.545425 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" Apr 17 20:20:59.547491 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.547464 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:20:59.547491 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.547468 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rd8dn\"" Apr 17 20:20:59.548326 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.548309 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:20:59.553563 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.553540 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc"] Apr 17 20:20:59.629414 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.629377 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc\" (UID: \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" Apr 17 20:20:59.629610 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.629433 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgk5s\" (UniqueName: \"kubernetes.io/projected/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-kube-api-access-mgk5s\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc\" (UID: \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" Apr 17 20:20:59.629610 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.629483 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc\" (UID: \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" Apr 17 20:20:59.730625 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.730589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc\" (UID: \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" Apr 17 20:20:59.730806 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.730634 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgk5s\" (UniqueName: \"kubernetes.io/projected/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-kube-api-access-mgk5s\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc\" (UID: \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" Apr 17 20:20:59.730806 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.730785 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc\" (UID: \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" Apr 17 20:20:59.731044 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.731026 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc\" (UID: \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" Apr 17 20:20:59.731084 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.731068 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc\" (UID: \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" Apr 17 20:20:59.745189 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.745160 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgk5s\" (UniqueName: \"kubernetes.io/projected/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-kube-api-access-mgk5s\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc\" (UID: \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" Apr 17 20:20:59.855985 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.855892 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc"
Apr 17 20:20:59.986495 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:20:59.986465 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc"]
Apr 17 20:20:59.988908 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:20:59.988868 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81e5f9da_e0e6_4215_83e1_fad8db26d7ac.slice/crio-6a4815af4192e5189bd1c9c73f830d4e417a53ca98859a40df17eb2c5d303934 WatchSource:0}: Error finding container 6a4815af4192e5189bd1c9c73f830d4e417a53ca98859a40df17eb2c5d303934: Status 404 returned error can't find the container with id 6a4815af4192e5189bd1c9c73f830d4e417a53ca98859a40df17eb2c5d303934
Apr 17 20:21:00.038712 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:00.038680 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" event={"ID":"81e5f9da-e0e6-4215-83e1-fad8db26d7ac","Type":"ContainerStarted","Data":"6a4815af4192e5189bd1c9c73f830d4e417a53ca98859a40df17eb2c5d303934"}
Apr 17 20:21:01.043963 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:01.043874 2579 generic.go:358] "Generic (PLEG): container finished" podID="81e5f9da-e0e6-4215-83e1-fad8db26d7ac" containerID="dd441e84799f2766c40fdde0f002572b50756ebdcbf885f2e2d2d562a585b7c0" exitCode=0
Apr 17 20:21:01.044370 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:01.043957 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" event={"ID":"81e5f9da-e0e6-4215-83e1-fad8db26d7ac","Type":"ContainerDied","Data":"dd441e84799f2766c40fdde0f002572b50756ebdcbf885f2e2d2d562a585b7c0"}
Apr 17 20:21:02.048674 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:02.048642 2579 generic.go:358] "Generic (PLEG): container finished" podID="81e5f9da-e0e6-4215-83e1-fad8db26d7ac" containerID="8a8182788cd2eb08210905d96ebae4c0bd5272b69818c81f93b435b729276961" exitCode=0
Apr 17 20:21:02.049076 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:02.048723 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" event={"ID":"81e5f9da-e0e6-4215-83e1-fad8db26d7ac","Type":"ContainerDied","Data":"8a8182788cd2eb08210905d96ebae4c0bd5272b69818c81f93b435b729276961"}
Apr 17 20:21:03.054176 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:03.054139 2579 generic.go:358] "Generic (PLEG): container finished" podID="81e5f9da-e0e6-4215-83e1-fad8db26d7ac" containerID="01f7cce471c2f4607f37c6953fb5e659dcfe07291683a01449e82616015983e4" exitCode=0
Apr 17 20:21:03.054614 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:03.054222 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" event={"ID":"81e5f9da-e0e6-4215-83e1-fad8db26d7ac","Type":"ContainerDied","Data":"01f7cce471c2f4607f37c6953fb5e659dcfe07291683a01449e82616015983e4"}
Apr 17 20:21:04.179025 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:04.178998 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc"
Apr 17 20:21:04.264146 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:04.264101 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-util\") pod \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\" (UID: \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\") "
Apr 17 20:21:04.264324 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:04.264165 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-bundle\") pod \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\" (UID: \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\") "
Apr 17 20:21:04.264324 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:04.264219 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgk5s\" (UniqueName: \"kubernetes.io/projected/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-kube-api-access-mgk5s\") pod \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\" (UID: \"81e5f9da-e0e6-4215-83e1-fad8db26d7ac\") "
Apr 17 20:21:04.264875 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:04.264843 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-bundle" (OuterVolumeSpecName: "bundle") pod "81e5f9da-e0e6-4215-83e1-fad8db26d7ac" (UID: "81e5f9da-e0e6-4215-83e1-fad8db26d7ac"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:21:04.266364 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:04.266336 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-kube-api-access-mgk5s" (OuterVolumeSpecName: "kube-api-access-mgk5s") pod "81e5f9da-e0e6-4215-83e1-fad8db26d7ac" (UID: "81e5f9da-e0e6-4215-83e1-fad8db26d7ac"). InnerVolumeSpecName "kube-api-access-mgk5s". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:21:04.269077 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:04.269054 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-util" (OuterVolumeSpecName: "util") pod "81e5f9da-e0e6-4215-83e1-fad8db26d7ac" (UID: "81e5f9da-e0e6-4215-83e1-fad8db26d7ac"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:21:04.365630 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:04.365539 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-util\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\""
Apr 17 20:21:04.365630 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:04.365572 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\""
Apr 17 20:21:04.365630 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:04.365582 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mgk5s\" (UniqueName: \"kubernetes.io/projected/81e5f9da-e0e6-4215-83e1-fad8db26d7ac-kube-api-access-mgk5s\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\""
Apr 17 20:21:05.062275 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:05.062238 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc" event={"ID":"81e5f9da-e0e6-4215-83e1-fad8db26d7ac","Type":"ContainerDied","Data":"6a4815af4192e5189bd1c9c73f830d4e417a53ca98859a40df17eb2c5d303934"}
Apr 17 20:21:05.062275 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:05.062276 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a4815af4192e5189bd1c9c73f830d4e417a53ca98859a40df17eb2c5d303934"
Apr 17 20:21:05.062471 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:05.062257 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5x5pdc"
Apr 17 20:21:16.492721 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.492681 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"]
Apr 17 20:21:16.493212 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.493093 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81e5f9da-e0e6-4215-83e1-fad8db26d7ac" containerName="extract"
Apr 17 20:21:16.493212 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.493107 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e5f9da-e0e6-4215-83e1-fad8db26d7ac" containerName="extract"
Apr 17 20:21:16.493212 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.493130 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81e5f9da-e0e6-4215-83e1-fad8db26d7ac" containerName="util"
Apr 17 20:21:16.493212 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.493136 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e5f9da-e0e6-4215-83e1-fad8db26d7ac" containerName="util"
Apr 17 20:21:16.493212 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.493146 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81e5f9da-e0e6-4215-83e1-fad8db26d7ac" containerName="pull"
Apr 17 20:21:16.493212 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.493152 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e5f9da-e0e6-4215-83e1-fad8db26d7ac" containerName="pull"
Apr 17 20:21:16.493212 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.493211 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="81e5f9da-e0e6-4215-83e1-fad8db26d7ac" containerName="extract"
Apr 17 20:21:16.495979 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.495956 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"
Apr 17 20:21:16.498352 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.498324 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 20:21:16.498484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.498390 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 20:21:16.498484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.498420 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 20:21:16.498484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.498399 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 20:21:16.498650 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.498624 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-5wc67\""
Apr 17 20:21:16.511665 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.511638 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"]
Apr 17 20:21:16.569250 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.569214 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czmbq\" (UniqueName: \"kubernetes.io/projected/c4ee013b-81d0-4948-b03e-47eb5ced8bfc-kube-api-access-czmbq\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-2pwvr\" (UID: \"c4ee013b-81d0-4948-b03e-47eb5ced8bfc\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"
Apr 17 20:21:16.569441 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.569265 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4ee013b-81d0-4948-b03e-47eb5ced8bfc-webhook-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-2pwvr\" (UID: \"c4ee013b-81d0-4948-b03e-47eb5ced8bfc\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"
Apr 17 20:21:16.569441 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.569295 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c4ee013b-81d0-4948-b03e-47eb5ced8bfc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-2pwvr\" (UID: \"c4ee013b-81d0-4948-b03e-47eb5ced8bfc\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"
Apr 17 20:21:16.591370 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.591335 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"]
Apr 17 20:21:16.595177 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.594531 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"
Apr 17 20:21:16.597339 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.597319 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 20:21:16.597560 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.597534 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 20:21:16.597821 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.597803 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rd8dn\""
Apr 17 20:21:16.604601 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.604577 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"]
Apr 17 20:21:16.670306 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.670273 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4ee013b-81d0-4948-b03e-47eb5ced8bfc-webhook-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-2pwvr\" (UID: \"c4ee013b-81d0-4948-b03e-47eb5ced8bfc\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"
Apr 17 20:21:16.670498 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.670315 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2\" (UID: \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"
Apr 17 20:21:16.670498 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.670351 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c4ee013b-81d0-4948-b03e-47eb5ced8bfc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-2pwvr\" (UID: \"c4ee013b-81d0-4948-b03e-47eb5ced8bfc\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"
Apr 17 20:21:16.670498 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.670397 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2\" (UID: \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"
Apr 17 20:21:16.670498 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.670428 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngb5v\" (UniqueName: \"kubernetes.io/projected/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-kube-api-access-ngb5v\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2\" (UID: \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"
Apr 17 20:21:16.670641 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.670512 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czmbq\" (UniqueName: \"kubernetes.io/projected/c4ee013b-81d0-4948-b03e-47eb5ced8bfc-kube-api-access-czmbq\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-2pwvr\" (UID: \"c4ee013b-81d0-4948-b03e-47eb5ced8bfc\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"
Apr 17 20:21:16.673103 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.673069 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4ee013b-81d0-4948-b03e-47eb5ced8bfc-webhook-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-2pwvr\" (UID: \"c4ee013b-81d0-4948-b03e-47eb5ced8bfc\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"
Apr 17 20:21:16.673239 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.673123 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c4ee013b-81d0-4948-b03e-47eb5ced8bfc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-2pwvr\" (UID: \"c4ee013b-81d0-4948-b03e-47eb5ced8bfc\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"
Apr 17 20:21:16.688689 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.688662 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czmbq\" (UniqueName: \"kubernetes.io/projected/c4ee013b-81d0-4948-b03e-47eb5ced8bfc-kube-api-access-czmbq\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-2pwvr\" (UID: \"c4ee013b-81d0-4948-b03e-47eb5ced8bfc\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"
Apr 17 20:21:16.771837 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.771756 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2\" (UID: \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"
Apr 17 20:21:16.771982 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.771903 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2\" (UID: \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"
Apr 17 20:21:16.771982 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.771928 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngb5v\" (UniqueName: \"kubernetes.io/projected/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-kube-api-access-ngb5v\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2\" (UID: \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"
Apr 17 20:21:16.772235 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.772211 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2\" (UID: \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"
Apr 17 20:21:16.772267 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.772214 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2\" (UID: \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"
Apr 17 20:21:16.785394 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.785362 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngb5v\" (UniqueName: \"kubernetes.io/projected/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-kube-api-access-ngb5v\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2\" (UID: \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"
Apr 17 20:21:16.806177 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.806139 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"
Apr 17 20:21:16.905149 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.905112 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"
Apr 17 20:21:16.940739 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:16.940714 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"]
Apr 17 20:21:16.944363 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:21:16.944333 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4ee013b_81d0_4948_b03e_47eb5ced8bfc.slice/crio-26c31167391deed4172b202de2a9669a77ae8db1b83c26d761638319501b6d05 WatchSource:0}: Error finding container 26c31167391deed4172b202de2a9669a77ae8db1b83c26d761638319501b6d05: Status 404 returned error can't find the container with id 26c31167391deed4172b202de2a9669a77ae8db1b83c26d761638319501b6d05
Apr 17 20:21:17.037056 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:17.037026 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"]
Apr 17 20:21:17.039289 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:21:17.039260 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f6e2e6e_ca5b_4f1c_aa16_0a5bec1c2285.slice/crio-43c76caa4f80d6dfdadfd61b1b3ae94785cd054c2c17c64aee9a7beea4edd0fa WatchSource:0}: Error finding container 43c76caa4f80d6dfdadfd61b1b3ae94785cd054c2c17c64aee9a7beea4edd0fa: Status 404 returned error can't find the container with id 43c76caa4f80d6dfdadfd61b1b3ae94785cd054c2c17c64aee9a7beea4edd0fa
Apr 17 20:21:17.105554 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:17.105523 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr" event={"ID":"c4ee013b-81d0-4948-b03e-47eb5ced8bfc","Type":"ContainerStarted","Data":"26c31167391deed4172b202de2a9669a77ae8db1b83c26d761638319501b6d05"}
Apr 17 20:21:17.107071 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:17.107045 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2" event={"ID":"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285","Type":"ContainerStarted","Data":"4f8d29de4343aa124bf3a3863949f4a6df1a005ede954c09e6fee4f2ea965f9d"}
Apr 17 20:21:17.107166 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:17.107077 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2" event={"ID":"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285","Type":"ContainerStarted","Data":"43c76caa4f80d6dfdadfd61b1b3ae94785cd054c2c17c64aee9a7beea4edd0fa"}
Apr 17 20:21:18.113838 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.113608 2579 generic.go:358] "Generic (PLEG): container finished" podID="0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" containerID="4f8d29de4343aa124bf3a3863949f4a6df1a005ede954c09e6fee4f2ea965f9d" exitCode=0
Apr 17 20:21:18.113838 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.113701 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2" event={"ID":"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285","Type":"ContainerDied","Data":"4f8d29de4343aa124bf3a3863949f4a6df1a005ede954c09e6fee4f2ea965f9d"}
Apr 17 20:21:18.584553 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.584521 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"]
Apr 17 20:21:18.588524 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.588499 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.591058 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.591029 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 20:21:18.591181 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.591029 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 17 20:21:18.591830 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.591803 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-96pmd\""
Apr 17 20:21:18.591830 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.591830 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 17 20:21:18.592003 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.591844 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:21:18.592003 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.591872 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 17 20:21:18.596082 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.596044 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"]
Apr 17 20:21:18.688051 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.688004 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ff4e6c44-621c-48b8-9698-958eb20c1f4b-manager-config\") pod \"lws-controller-manager-fd99964b4-6ddgt\" (UID: \"ff4e6c44-621c-48b8-9698-958eb20c1f4b\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.688284 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.688127 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff4e6c44-621c-48b8-9698-958eb20c1f4b-cert\") pod \"lws-controller-manager-fd99964b4-6ddgt\" (UID: \"ff4e6c44-621c-48b8-9698-958eb20c1f4b\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.688284 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.688166 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff4e6c44-621c-48b8-9698-958eb20c1f4b-metrics-cert\") pod \"lws-controller-manager-fd99964b4-6ddgt\" (UID: \"ff4e6c44-621c-48b8-9698-958eb20c1f4b\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.688284 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.688210 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf52n\" (UniqueName: \"kubernetes.io/projected/ff4e6c44-621c-48b8-9698-958eb20c1f4b-kube-api-access-jf52n\") pod \"lws-controller-manager-fd99964b4-6ddgt\" (UID: \"ff4e6c44-621c-48b8-9698-958eb20c1f4b\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.789532 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.789494 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jf52n\" (UniqueName: \"kubernetes.io/projected/ff4e6c44-621c-48b8-9698-958eb20c1f4b-kube-api-access-jf52n\") pod \"lws-controller-manager-fd99964b4-6ddgt\" (UID: \"ff4e6c44-621c-48b8-9698-958eb20c1f4b\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.789702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.789541 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ff4e6c44-621c-48b8-9698-958eb20c1f4b-manager-config\") pod \"lws-controller-manager-fd99964b4-6ddgt\" (UID: \"ff4e6c44-621c-48b8-9698-958eb20c1f4b\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.789702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.789633 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff4e6c44-621c-48b8-9698-958eb20c1f4b-cert\") pod \"lws-controller-manager-fd99964b4-6ddgt\" (UID: \"ff4e6c44-621c-48b8-9698-958eb20c1f4b\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.789702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.789663 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff4e6c44-621c-48b8-9698-958eb20c1f4b-metrics-cert\") pod \"lws-controller-manager-fd99964b4-6ddgt\" (UID: \"ff4e6c44-621c-48b8-9698-958eb20c1f4b\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.790347 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.790308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ff4e6c44-621c-48b8-9698-958eb20c1f4b-manager-config\") pod \"lws-controller-manager-fd99964b4-6ddgt\" (UID: \"ff4e6c44-621c-48b8-9698-958eb20c1f4b\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.792559 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.792533 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff4e6c44-621c-48b8-9698-958eb20c1f4b-metrics-cert\") pod \"lws-controller-manager-fd99964b4-6ddgt\" (UID: \"ff4e6c44-621c-48b8-9698-958eb20c1f4b\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.792771 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.792732 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff4e6c44-621c-48b8-9698-958eb20c1f4b-cert\") pod \"lws-controller-manager-fd99964b4-6ddgt\" (UID: \"ff4e6c44-621c-48b8-9698-958eb20c1f4b\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.807875 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.807841 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf52n\" (UniqueName: \"kubernetes.io/projected/ff4e6c44-621c-48b8-9698-958eb20c1f4b-kube-api-access-jf52n\") pod \"lws-controller-manager-fd99964b4-6ddgt\" (UID: \"ff4e6c44-621c-48b8-9698-958eb20c1f4b\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:18.901260 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:18.901227 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:19.539652 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:19.539628 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"]
Apr 17 20:21:19.542889 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:21:19.542864 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff4e6c44_621c_48b8_9698_958eb20c1f4b.slice/crio-73940bc15f8f6e7658978b095b134fe7974ab4199438fe00086906571139f7ec WatchSource:0}: Error finding container 73940bc15f8f6e7658978b095b134fe7974ab4199438fe00086906571139f7ec: Status 404 returned error can't find the container with id 73940bc15f8f6e7658978b095b134fe7974ab4199438fe00086906571139f7ec
Apr 17 20:21:20.123231 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:20.123130 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr" event={"ID":"c4ee013b-81d0-4948-b03e-47eb5ced8bfc","Type":"ContainerStarted","Data":"2a40ade44ab54d0c322afc2c2d83c1630506a0d7be654795240d7ea636f261ef"}
Apr 17 20:21:20.123401 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:20.123257 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr"
Apr 17 20:21:20.124241 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:20.124210 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt" event={"ID":"ff4e6c44-621c-48b8-9698-958eb20c1f4b","Type":"ContainerStarted","Data":"73940bc15f8f6e7658978b095b134fe7974ab4199438fe00086906571139f7ec"}
Apr 17 20:21:20.125765 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:20.125720 2579 generic.go:358] "Generic (PLEG): container finished" podID="0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" containerID="1afff9b2d0ce3abbd3b01fc7b8e4ee0c5e62ee71bfeb68244dcb2e3de3aa9989" exitCode=0
Apr 17 20:21:20.125859 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:20.125771 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2" event={"ID":"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285","Type":"ContainerDied","Data":"1afff9b2d0ce3abbd3b01fc7b8e4ee0c5e62ee71bfeb68244dcb2e3de3aa9989"}
Apr 17 20:21:20.145891 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:20.145845 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr" podStartSLOduration=1.632703989 podStartE2EDuration="4.14582951s" podCreationTimestamp="2026-04-17 20:21:16 +0000 UTC" firstStartedPulling="2026-04-17 20:21:16.946360191 +0000 UTC m=+346.611285875" lastFinishedPulling="2026-04-17 20:21:19.459485728 +0000 UTC m=+349.124411396" observedRunningTime="2026-04-17 20:21:20.143408401 +0000 UTC m=+349.808334090" watchObservedRunningTime="2026-04-17 20:21:20.14582951 +0000 UTC m=+349.810755252"
Apr 17 20:21:21.132191 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:21.132159 2579 generic.go:358] "Generic (PLEG): container finished" podID="0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" containerID="c36728bb02ffb27f5d347187a83206fbb0dc5a0a4958e3e816d11f1ac1839c37" exitCode=0
Apr 17 20:21:21.132568 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:21.132249 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2" event={"ID":"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285","Type":"ContainerDied","Data":"c36728bb02ffb27f5d347187a83206fbb0dc5a0a4958e3e816d11f1ac1839c37"}
Apr 17 20:21:22.137640 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.137603 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt" event={"ID":"ff4e6c44-621c-48b8-9698-958eb20c1f4b","Type":"ContainerStarted","Data":"6b3b04aae0d3cfb5bc6d69d9063ff2cea59734ae40132385c16b55d48db9f334"}
Apr 17 20:21:22.138225 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.137663 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt"
Apr 17 20:21:22.154501 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.154452 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt" podStartSLOduration=2.603175557 podStartE2EDuration="4.154437018s" podCreationTimestamp="2026-04-17 20:21:18 +0000 UTC" firstStartedPulling="2026-04-17 20:21:19.545152079 +0000 UTC m=+349.210077747" lastFinishedPulling="2026-04-17 20:21:21.09641354 +0000 UTC m=+350.761339208" observedRunningTime="2026-04-17 20:21:22.153056023 +0000 UTC m=+351.817981713" watchObservedRunningTime="2026-04-17 20:21:22.154437018 +0000 UTC m=+351.819362707"
Apr 17 20:21:22.263461 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.263424 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2"
Apr 17 20:21:22.326520 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.326485 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngb5v\" (UniqueName: \"kubernetes.io/projected/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-kube-api-access-ngb5v\") pod \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\" (UID: \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\") "
Apr 17 20:21:22.326684 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.326533 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-bundle\") pod \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\" (UID: \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\") "
Apr 17 20:21:22.326684 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.326599 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-util\") pod \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\" (UID: \"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285\") "
Apr 17 20:21:22.327338 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.327309 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-bundle" (OuterVolumeSpecName: "bundle") pod "0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" (UID: "0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:21:22.328773 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.328735 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-kube-api-access-ngb5v" (OuterVolumeSpecName: "kube-api-access-ngb5v") pod "0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" (UID: "0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285"). InnerVolumeSpecName "kube-api-access-ngb5v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:21:22.331654 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.331610 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-util" (OuterVolumeSpecName: "util") pod "0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" (UID: "0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:21:22.427962 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.427883 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-util\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:21:22.427962 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.427917 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ngb5v\" (UniqueName: \"kubernetes.io/projected/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-kube-api-access-ngb5v\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:21:22.427962 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:22.427927 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:21:23.143050 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:23.142952 2579 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2" event={"ID":"0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285","Type":"ContainerDied","Data":"43c76caa4f80d6dfdadfd61b1b3ae94785cd054c2c17c64aee9a7beea4edd0fa"} Apr 17 20:21:23.143050 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:23.143002 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43c76caa4f80d6dfdadfd61b1b3ae94785cd054c2c17c64aee9a7beea4edd0fa" Apr 17 20:21:23.143050 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:23.142965 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9ssts2" Apr 17 20:21:31.134638 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:31.134606 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-2pwvr" Apr 17 20:21:33.145901 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.145869 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-6ddgt" Apr 17 20:21:33.342454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.342418 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds"] Apr 17 20:21:33.342853 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.342826 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" containerName="pull" Apr 17 20:21:33.342853 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.342848 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" containerName="pull" Apr 17 20:21:33.343012 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.342882 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" containerName="util" Apr 17 20:21:33.343012 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.342894 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" containerName="util" Apr 17 20:21:33.343012 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.342932 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" containerName="extract" Apr 17 20:21:33.343012 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.342941 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" containerName="extract" Apr 17 20:21:33.343142 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.343034 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f6e2e6e-ca5b-4f1c-aa16-0a5bec1c2285" containerName="extract" Apr 17 20:21:33.347298 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.347280 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 17 20:21:33.349470 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.349449 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:21:33.349614 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.349595 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:21:33.349678 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.349665 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rd8dn\"" Apr 17 20:21:33.354147 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.354121 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds"] Apr 17 20:21:33.437159 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.437060 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c301a58-92d8-49ed-9172-fa4d92ba853e-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds\" (UID: \"8c301a58-92d8-49ed-9172-fa4d92ba853e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 17 20:21:33.437318 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.437161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbz7k\" (UniqueName: \"kubernetes.io/projected/8c301a58-92d8-49ed-9172-fa4d92ba853e-kube-api-access-jbz7k\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds\" (UID: \"8c301a58-92d8-49ed-9172-fa4d92ba853e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 
17 20:21:33.437318 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.437198 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c301a58-92d8-49ed-9172-fa4d92ba853e-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds\" (UID: \"8c301a58-92d8-49ed-9172-fa4d92ba853e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 17 20:21:33.537810 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.537766 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbz7k\" (UniqueName: \"kubernetes.io/projected/8c301a58-92d8-49ed-9172-fa4d92ba853e-kube-api-access-jbz7k\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds\" (UID: \"8c301a58-92d8-49ed-9172-fa4d92ba853e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 17 20:21:33.538022 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.537826 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c301a58-92d8-49ed-9172-fa4d92ba853e-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds\" (UID: \"8c301a58-92d8-49ed-9172-fa4d92ba853e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 17 20:21:33.538022 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.537862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c301a58-92d8-49ed-9172-fa4d92ba853e-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds\" (UID: \"8c301a58-92d8-49ed-9172-fa4d92ba853e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 17 20:21:33.538262 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:21:33.538240 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c301a58-92d8-49ed-9172-fa4d92ba853e-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds\" (UID: \"8c301a58-92d8-49ed-9172-fa4d92ba853e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 17 20:21:33.538307 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.538281 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c301a58-92d8-49ed-9172-fa4d92ba853e-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds\" (UID: \"8c301a58-92d8-49ed-9172-fa4d92ba853e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 17 20:21:33.548431 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.548393 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbz7k\" (UniqueName: \"kubernetes.io/projected/8c301a58-92d8-49ed-9172-fa4d92ba853e-kube-api-access-jbz7k\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds\" (UID: \"8c301a58-92d8-49ed-9172-fa4d92ba853e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 17 20:21:33.657187 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.657146 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 17 20:21:33.790428 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:33.790402 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds"] Apr 17 20:21:33.793092 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:21:33.793060 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c301a58_92d8_49ed_9172_fa4d92ba853e.slice/crio-576f2d0981b18007e22c05555d917107fd13379bb8878e8e8ac398a0a593d123 WatchSource:0}: Error finding container 576f2d0981b18007e22c05555d917107fd13379bb8878e8e8ac398a0a593d123: Status 404 returned error can't find the container with id 576f2d0981b18007e22c05555d917107fd13379bb8878e8e8ac398a0a593d123 Apr 17 20:21:34.183487 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:34.183450 2579 generic.go:358] "Generic (PLEG): container finished" podID="8c301a58-92d8-49ed-9172-fa4d92ba853e" containerID="d7278f7f039cc2ff4d76ffc76573ab419c2574bbd214157920523066d481dcf6" exitCode=0 Apr 17 20:21:34.183918 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:34.183544 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" event={"ID":"8c301a58-92d8-49ed-9172-fa4d92ba853e","Type":"ContainerDied","Data":"d7278f7f039cc2ff4d76ffc76573ab419c2574bbd214157920523066d481dcf6"} Apr 17 20:21:34.183918 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:34.183574 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" event={"ID":"8c301a58-92d8-49ed-9172-fa4d92ba853e","Type":"ContainerStarted","Data":"576f2d0981b18007e22c05555d917107fd13379bb8878e8e8ac398a0a593d123"} Apr 17 20:21:35.189145 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:21:35.189108 2579 generic.go:358] "Generic (PLEG): container finished" podID="8c301a58-92d8-49ed-9172-fa4d92ba853e" containerID="f6311a6280366ee6942069a0550735ecfa2798b9084726749b846a96b729a17e" exitCode=0 Apr 17 20:21:35.189599 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:35.189193 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" event={"ID":"8c301a58-92d8-49ed-9172-fa4d92ba853e","Type":"ContainerDied","Data":"f6311a6280366ee6942069a0550735ecfa2798b9084726749b846a96b729a17e"} Apr 17 20:21:36.197261 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:36.197223 2579 generic.go:358] "Generic (PLEG): container finished" podID="8c301a58-92d8-49ed-9172-fa4d92ba853e" containerID="9822ef6940047bfa4a310e89eec240f16669ee52f5e90565533b94be5a3b3d34" exitCode=0 Apr 17 20:21:36.197639 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:36.197280 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" event={"ID":"8c301a58-92d8-49ed-9172-fa4d92ba853e","Type":"ContainerDied","Data":"9822ef6940047bfa4a310e89eec240f16669ee52f5e90565533b94be5a3b3d34"} Apr 17 20:21:37.321848 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:37.321825 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 17 20:21:37.474059 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:37.473971 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbz7k\" (UniqueName: \"kubernetes.io/projected/8c301a58-92d8-49ed-9172-fa4d92ba853e-kube-api-access-jbz7k\") pod \"8c301a58-92d8-49ed-9172-fa4d92ba853e\" (UID: \"8c301a58-92d8-49ed-9172-fa4d92ba853e\") " Apr 17 20:21:37.474059 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:37.474057 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c301a58-92d8-49ed-9172-fa4d92ba853e-bundle\") pod \"8c301a58-92d8-49ed-9172-fa4d92ba853e\" (UID: \"8c301a58-92d8-49ed-9172-fa4d92ba853e\") " Apr 17 20:21:37.474266 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:37.474085 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c301a58-92d8-49ed-9172-fa4d92ba853e-util\") pod \"8c301a58-92d8-49ed-9172-fa4d92ba853e\" (UID: \"8c301a58-92d8-49ed-9172-fa4d92ba853e\") " Apr 17 20:21:37.474988 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:37.474961 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c301a58-92d8-49ed-9172-fa4d92ba853e-bundle" (OuterVolumeSpecName: "bundle") pod "8c301a58-92d8-49ed-9172-fa4d92ba853e" (UID: "8c301a58-92d8-49ed-9172-fa4d92ba853e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:21:37.476331 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:37.476306 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c301a58-92d8-49ed-9172-fa4d92ba853e-kube-api-access-jbz7k" (OuterVolumeSpecName: "kube-api-access-jbz7k") pod "8c301a58-92d8-49ed-9172-fa4d92ba853e" (UID: "8c301a58-92d8-49ed-9172-fa4d92ba853e"). InnerVolumeSpecName "kube-api-access-jbz7k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:21:37.479605 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:37.479580 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c301a58-92d8-49ed-9172-fa4d92ba853e-util" (OuterVolumeSpecName: "util") pod "8c301a58-92d8-49ed-9172-fa4d92ba853e" (UID: "8c301a58-92d8-49ed-9172-fa4d92ba853e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:21:37.575591 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:37.575554 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jbz7k\" (UniqueName: \"kubernetes.io/projected/8c301a58-92d8-49ed-9172-fa4d92ba853e-kube-api-access-jbz7k\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:21:37.575591 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:37.575587 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c301a58-92d8-49ed-9172-fa4d92ba853e-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:21:37.575591 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:37.575596 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c301a58-92d8-49ed-9172-fa4d92ba853e-util\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:21:38.206274 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:38.206238 2579 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" event={"ID":"8c301a58-92d8-49ed-9172-fa4d92ba853e","Type":"ContainerDied","Data":"576f2d0981b18007e22c05555d917107fd13379bb8878e8e8ac398a0a593d123"} Apr 17 20:21:38.206274 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:38.206274 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="576f2d0981b18007e22c05555d917107fd13379bb8878e8e8ac398a0a593d123" Apr 17 20:21:38.206552 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:38.206294 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dnpds" Apr 17 20:21:47.531200 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.531162 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b"] Apr 17 20:21:47.531584 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.531525 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c301a58-92d8-49ed-9172-fa4d92ba853e" containerName="extract" Apr 17 20:21:47.531584 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.531537 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c301a58-92d8-49ed-9172-fa4d92ba853e" containerName="extract" Apr 17 20:21:47.531584 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.531546 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c301a58-92d8-49ed-9172-fa4d92ba853e" containerName="pull" Apr 17 20:21:47.531584 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.531552 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c301a58-92d8-49ed-9172-fa4d92ba853e" containerName="pull" Apr 17 20:21:47.531584 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.531561 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8c301a58-92d8-49ed-9172-fa4d92ba853e" containerName="util" Apr 17 20:21:47.531584 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.531570 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c301a58-92d8-49ed-9172-fa4d92ba853e" containerName="util" Apr 17 20:21:47.531818 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.531635 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c301a58-92d8-49ed-9172-fa4d92ba853e" containerName="extract" Apr 17 20:21:47.536023 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.536002 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" Apr 17 20:21:47.538728 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.538601 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rd8dn\"" Apr 17 20:21:47.538875 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.538765 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:21:47.539601 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.539585 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:21:47.546384 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.546352 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b"] Apr 17 20:21:47.558820 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.558792 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b\" (UID: \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" Apr 17 20:21:47.558972 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.558829 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b\" (UID: \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" Apr 17 20:21:47.558972 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.558854 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpj62\" (UniqueName: \"kubernetes.io/projected/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-kube-api-access-jpj62\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b\" (UID: \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" Apr 17 20:21:47.659987 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.659947 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b\" (UID: \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" Apr 17 20:21:47.659987 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.659988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b\" (UID: \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" Apr 17 20:21:47.660248 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.660011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpj62\" (UniqueName: \"kubernetes.io/projected/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-kube-api-access-jpj62\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b\" (UID: \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" Apr 17 20:21:47.660363 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.660338 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b\" (UID: \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" Apr 17 20:21:47.660431 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.660401 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b\" (UID: \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" Apr 17 20:21:47.668512 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.668480 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpj62\" (UniqueName: \"kubernetes.io/projected/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-kube-api-access-jpj62\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b\" (UID: \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" Apr 17 20:21:47.846688 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.846583 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" Apr 17 20:21:47.975947 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:47.975918 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b"] Apr 17 20:21:47.977778 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:21:47.977726 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe9bf5ee_37ad_41d1_9787_5138bf48d4af.slice/crio-2c75dfe4f38012a3765de4b25959884d2f6c7528cf04769e90dc884539c87ac6 WatchSource:0}: Error finding container 2c75dfe4f38012a3765de4b25959884d2f6c7528cf04769e90dc884539c87ac6: Status 404 returned error can't find the container with id 2c75dfe4f38012a3765de4b25959884d2f6c7528cf04769e90dc884539c87ac6 Apr 17 20:21:48.241675 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:48.241639 2579 generic.go:358] "Generic (PLEG): container finished" podID="fe9bf5ee-37ad-41d1-9787-5138bf48d4af" containerID="b9c18488a9bb4278983fd5fb2ce8be9e5c163079b57283ac254b6097ff4ed945" exitCode=0 Apr 17 20:21:48.241899 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:48.241711 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" event={"ID":"fe9bf5ee-37ad-41d1-9787-5138bf48d4af","Type":"ContainerDied","Data":"b9c18488a9bb4278983fd5fb2ce8be9e5c163079b57283ac254b6097ff4ed945"} Apr 17 20:21:48.241899 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:48.241764 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" 
event={"ID":"fe9bf5ee-37ad-41d1-9787-5138bf48d4af","Type":"ContainerStarted","Data":"2c75dfe4f38012a3765de4b25959884d2f6c7528cf04769e90dc884539c87ac6"}
Apr 17 20:21:49.247202 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:49.247104 2579 generic.go:358] "Generic (PLEG): container finished" podID="fe9bf5ee-37ad-41d1-9787-5138bf48d4af" containerID="6e3239a890e9891f912f733845856d004eba752ca0be1f5d8e0dd75c8a1a809d" exitCode=0
Apr 17 20:21:49.247202 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:49.247190 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" event={"ID":"fe9bf5ee-37ad-41d1-9787-5138bf48d4af","Type":"ContainerDied","Data":"6e3239a890e9891f912f733845856d004eba752ca0be1f5d8e0dd75c8a1a809d"}
Apr 17 20:21:50.253099 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:50.253065 2579 generic.go:358] "Generic (PLEG): container finished" podID="fe9bf5ee-37ad-41d1-9787-5138bf48d4af" containerID="53f1e0345996bb3630a7a4d1e5c338a779bdc83b7aa51885d32575c000acb547" exitCode=0
Apr 17 20:21:50.253544 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:50.253149 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" event={"ID":"fe9bf5ee-37ad-41d1-9787-5138bf48d4af","Type":"ContainerDied","Data":"53f1e0345996bb3630a7a4d1e5c338a779bdc83b7aa51885d32575c000acb547"}
Apr 17 20:21:51.383532 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:51.383506 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b"
Apr 17 20:21:51.393679 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:51.393638 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-bundle\") pod \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\" (UID: \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\") "
Apr 17 20:21:51.393833 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:51.393709 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpj62\" (UniqueName: \"kubernetes.io/projected/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-kube-api-access-jpj62\") pod \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\" (UID: \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\") "
Apr 17 20:21:51.393833 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:51.393769 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-util\") pod \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\" (UID: \"fe9bf5ee-37ad-41d1-9787-5138bf48d4af\") "
Apr 17 20:21:51.394548 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:51.394514 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-bundle" (OuterVolumeSpecName: "bundle") pod "fe9bf5ee-37ad-41d1-9787-5138bf48d4af" (UID: "fe9bf5ee-37ad-41d1-9787-5138bf48d4af"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:21:51.396083 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:51.396053 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-kube-api-access-jpj62" (OuterVolumeSpecName: "kube-api-access-jpj62") pod "fe9bf5ee-37ad-41d1-9787-5138bf48d4af" (UID: "fe9bf5ee-37ad-41d1-9787-5138bf48d4af"). InnerVolumeSpecName "kube-api-access-jpj62". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:21:51.400038 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:51.400009 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-util" (OuterVolumeSpecName: "util") pod "fe9bf5ee-37ad-41d1-9787-5138bf48d4af" (UID: "fe9bf5ee-37ad-41d1-9787-5138bf48d4af"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:21:51.494651 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:51.494612 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jpj62\" (UniqueName: \"kubernetes.io/projected/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-kube-api-access-jpj62\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\""
Apr 17 20:21:51.494651 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:51.494646 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-util\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\""
Apr 17 20:21:51.494651 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:51.494656 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe9bf5ee-37ad-41d1-9787-5138bf48d4af-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\""
Apr 17 20:21:52.261813 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:52.261772 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b" event={"ID":"fe9bf5ee-37ad-41d1-9787-5138bf48d4af","Type":"ContainerDied","Data":"2c75dfe4f38012a3765de4b25959884d2f6c7528cf04769e90dc884539c87ac6"}
Apr 17 20:21:52.261813 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:52.261815 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c75dfe4f38012a3765de4b25959884d2f6c7528cf04769e90dc884539c87ac6"
Apr 17 20:21:52.261813 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:52.261817 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hzg5b"
Apr 17 20:21:57.748962 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.748923 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"]
Apr 17 20:21:57.749422 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.749403 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe9bf5ee-37ad-41d1-9787-5138bf48d4af" containerName="util"
Apr 17 20:21:57.749465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.749429 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe9bf5ee-37ad-41d1-9787-5138bf48d4af" containerName="util"
Apr 17 20:21:57.749465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.749444 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe9bf5ee-37ad-41d1-9787-5138bf48d4af" containerName="pull"
Apr 17 20:21:57.749465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.749453 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe9bf5ee-37ad-41d1-9787-5138bf48d4af" containerName="pull"
Apr 17 20:21:57.749570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.749464 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe9bf5ee-37ad-41d1-9787-5138bf48d4af" containerName="extract"
Apr 17 20:21:57.749570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.749474 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe9bf5ee-37ad-41d1-9787-5138bf48d4af" containerName="extract"
Apr 17 20:21:57.749633 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.749589 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe9bf5ee-37ad-41d1-9787-5138bf48d4af" containerName="extract"
Apr 17 20:21:57.758186 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.758164 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.760937 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.760703 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 20:21:57.760937 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.760785 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 20:21:57.761142 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.760966 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-frd8t\""
Apr 17 20:21:57.762394 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.762372 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 17 20:21:57.765012 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.764989 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"]
Apr 17 20:21:57.826238 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.826200 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"]
Apr 17 20:21:57.830220 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.830197 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:57.840502 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.840480 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"]
Apr 17 20:21:57.846730 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.846702 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.846868 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.846762 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.846868 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.846792 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.846942 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.846911 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.846994 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.846949 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.847068 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.847048 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.847130 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.847104 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds56w\" (UniqueName: \"kubernetes.io/projected/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-kube-api-access-ds56w\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.847186 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.847158 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.847240 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.847189 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.948085 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948049 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.948085 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948086 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.948285 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948222 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:57.948285 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948264 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:57.948380 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948326 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.948432 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948404 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds56w\" (UniqueName: \"kubernetes.io/projected/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-kube-api-access-ds56w\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.948482 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948438 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bkr5\" (UniqueName: \"kubernetes.io/projected/6c8d819c-bde4-4f6a-8afd-61f822c02c75-kube-api-access-6bkr5\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:57.948482 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948449 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.948482 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.948637 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948489 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.948637 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.948637 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948540 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:57.948637 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948605 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:57.948874 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948644 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:57.948874 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948692 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:57.948874 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948722 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.948874 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948768 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:57.948874 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948805 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:57.948874 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.949187 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948901 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.949187 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.948952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.949187 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.949134 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.949349 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.949192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.951302 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.951280 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.951579 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.951559 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.958056 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.958026 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:57.958439 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:57.958401 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds56w\" (UniqueName: \"kubernetes.io/projected/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-kube-api-access-ds56w\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:58.049577 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.049480 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.049577 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.049528 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.049577 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.049554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.049577 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.049573 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.049977 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.049592 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.049977 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.049629 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.049977 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.049650 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.049977 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.049707 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bkr5\" (UniqueName: \"kubernetes.io/projected/6c8d819c-bde4-4f6a-8afd-61f822c02c75-kube-api-access-6bkr5\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.049977 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.049786 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.050232 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.050045 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.050232 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.050073 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.050338 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.050269 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.050566 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.050378 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.050566 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.050413 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.051978 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.051954 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.052173 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.052156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.057134 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.057105 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6c8d819c-bde4-4f6a-8afd-61f822c02c75-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.057237 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.057216 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bkr5\" (UniqueName: \"kubernetes.io/projected/6c8d819c-bde4-4f6a-8afd-61f822c02c75-kube-api-access-6bkr5\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj\" (UID: \"6c8d819c-bde4-4f6a-8afd-61f822c02c75\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.071457 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.071422 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"
Apr 17 20:21:58.141724 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.141686 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"
Apr 17 20:21:58.208304 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.208275 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"]
Apr 17 20:21:58.212167 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:21:58.212129 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1a9ee71_da70_4ec3_b8af_32f87e1ce1aa.slice/crio-4cf9b7d5311a2626bee84236ad1a06010d386837e8a782daaa17af5a8e448516 WatchSource:0}: Error finding container 4cf9b7d5311a2626bee84236ad1a06010d386837e8a782daaa17af5a8e448516: Status 404 returned error can't find the container with id 4cf9b7d5311a2626bee84236ad1a06010d386837e8a782daaa17af5a8e448516
Apr 17 20:21:58.281935 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.281908 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj"]
Apr 17 20:21:58.283805 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:58.283773 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm" event={"ID":"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa","Type":"ContainerStarted","Data":"4cf9b7d5311a2626bee84236ad1a06010d386837e8a782daaa17af5a8e448516"}
Apr 17 20:21:58.284474 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:21:58.284449 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c8d819c_bde4_4f6a_8afd_61f822c02c75.slice/crio-7944a2136397f9d97979eaa6b87920b3a99923a6e44fb815d3a49322cf5cd6f6 WatchSource:0}: Error finding container 7944a2136397f9d97979eaa6b87920b3a99923a6e44fb815d3a49322cf5cd6f6: Status 404 returned error can't find the container with id 7944a2136397f9d97979eaa6b87920b3a99923a6e44fb815d3a49322cf5cd6f6
Apr 17 20:21:59.292300 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:21:59.292241 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj" event={"ID":"6c8d819c-bde4-4f6a-8afd-61f822c02c75","Type":"ContainerStarted","Data":"7944a2136397f9d97979eaa6b87920b3a99923a6e44fb815d3a49322cf5cd6f6"}
Apr 17 20:22:00.805238 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:00.805192 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 17 20:22:00.805648 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:00.805267 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 17 20:22:00.805648 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:00.805296 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 17 20:22:00.811278 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:00.811245 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 17 20:22:00.811375 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:00.811311 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 17 20:22:00.811375 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:00.811340 2579 kubelet_resources.go:45] "Allocatable"
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 17 20:22:01.301415 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:01.301375 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj" event={"ID":"6c8d819c-bde4-4f6a-8afd-61f822c02c75","Type":"ContainerStarted","Data":"2d34f89d71082e9b9b39efd130b5c657f3a77094ffb0807787c94696af426759"} Apr 17 20:22:01.302801 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:01.302774 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm" event={"ID":"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa","Type":"ContainerStarted","Data":"081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499"} Apr 17 20:22:01.323606 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:01.323553 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj" podStartSLOduration=1.8048915509999999 podStartE2EDuration="4.323536731s" podCreationTimestamp="2026-04-17 20:21:57 +0000 UTC" firstStartedPulling="2026-04-17 20:21:58.286287449 +0000 UTC m=+387.951213120" lastFinishedPulling="2026-04-17 20:22:00.80493263 +0000 UTC m=+390.469858300" observedRunningTime="2026-04-17 20:22:01.321325021 +0000 UTC m=+390.986250759" watchObservedRunningTime="2026-04-17 20:22:01.323536731 +0000 UTC m=+390.988462422" Apr 17 20:22:01.339246 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:01.339189 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm" podStartSLOduration=1.7421798229999998 podStartE2EDuration="4.339173819s" podCreationTimestamp="2026-04-17 20:21:57 +0000 UTC" firstStartedPulling="2026-04-17 20:21:58.214033399 
+0000 UTC m=+387.878959068" lastFinishedPulling="2026-04-17 20:22:00.811027381 +0000 UTC m=+390.475953064" observedRunningTime="2026-04-17 20:22:01.336564676 +0000 UTC m=+391.001490366" watchObservedRunningTime="2026-04-17 20:22:01.339173819 +0000 UTC m=+391.004099508" Apr 17 20:22:02.072243 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:02.072199 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm" Apr 17 20:22:02.073794 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:02.073767 2579 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.36:15021/healthz/ready\": dial tcp 10.133.0.36:15021: connect: connection refused" start-of-body= Apr 17 20:22:02.073867 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:02.073835 2579 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm" podUID="f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.36:15021/healthz/ready\": dial tcp 10.133.0.36:15021: connect: connection refused" Apr 17 20:22:02.142635 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:02.142595 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj" Apr 17 20:22:02.147245 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:02.147220 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj" Apr 17 20:22:02.307092 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:02.307058 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj" Apr 17 20:22:02.308023 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:02.307999 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj" Apr 17 20:22:02.354239 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:02.354157 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"] Apr 17 20:22:03.071940 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:03.071903 2579 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.36:15021/healthz/ready\": dial tcp 10.133.0.36:15021: connect: connection refused" start-of-body= Apr 17 20:22:03.072140 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:03.071972 2579 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm" podUID="f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.36:15021/healthz/ready\": dial tcp 10.133.0.36:15021: connect: connection refused" Apr 17 20:22:04.071894 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:04.071855 2579 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.36:15021/healthz/ready\": dial tcp 10.133.0.36:15021: connect: connection refused" start-of-body= Apr 17 20:22:04.072274 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:04.071925 2579 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm" 
podUID="f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.36:15021/healthz/ready\": dial tcp 10.133.0.36:15021: connect: connection refused" Apr 17 20:22:04.313726 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:04.313690 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm" podUID="f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" containerName="istio-proxy" containerID="cri-o://081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499" gracePeriod=30 Apr 17 20:22:09.566014 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.565986 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm" Apr 17 20:22:09.655582 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.655551 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds56w\" (UniqueName: \"kubernetes.io/projected/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-kube-api-access-ds56w\") pod \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " Apr 17 20:22:09.655886 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.655606 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-workload-certs\") pod \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " Apr 17 20:22:09.655886 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.655631 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-envoy\") pod \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " 
Apr 17 20:22:09.655886 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.655822 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-data\") pod \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " Apr 17 20:22:09.656067 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.655899 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-workload-socket\") pod \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " Apr 17 20:22:09.656067 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.655937 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-podinfo\") pod \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " Apr 17 20:22:09.656067 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.655939 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-workload-certs" (OuterVolumeSpecName: "workload-certs") pod "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" (UID: "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa"). InnerVolumeSpecName "workload-certs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:09.656067 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.655993 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istiod-ca-cert\") pod \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " Apr 17 20:22:09.656067 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.656017 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-token\") pod \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " Apr 17 20:22:09.656067 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.656060 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-credential-socket\") pod \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\" (UID: \"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa\") " Apr 17 20:22:09.656363 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.656074 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-data" (OuterVolumeSpecName: "istio-data") pod "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" (UID: "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa"). InnerVolumeSpecName "istio-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:09.656363 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.656094 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-workload-socket" (OuterVolumeSpecName: "workload-socket") pod "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" (UID: "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa"). 
InnerVolumeSpecName "workload-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:09.656363 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.656334 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-credential-socket" (OuterVolumeSpecName: "credential-socket") pod "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" (UID: "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa"). InnerVolumeSpecName "credential-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:09.656488 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.656350 2579 reconciler_common.go:299] "Volume detached for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-data\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:09.656541 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.656349 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istiod-ca-cert" (OuterVolumeSpecName: "istiod-ca-cert") pod "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" (UID: "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa"). InnerVolumeSpecName "istiod-ca-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:22:09.656541 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.656485 2579 reconciler_common.go:299] "Volume detached for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-workload-socket\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:09.656635 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.656560 2579 reconciler_common.go:299] "Volume detached for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-workload-certs\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:09.658245 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.658223 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-kube-api-access-ds56w" (OuterVolumeSpecName: "kube-api-access-ds56w") pod "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" (UID: "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa"). InnerVolumeSpecName "kube-api-access-ds56w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:22:09.658346 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.658243 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-envoy" (OuterVolumeSpecName: "istio-envoy") pod "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" (UID: "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa"). InnerVolumeSpecName "istio-envoy". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:09.658655 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.658640 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-token" (OuterVolumeSpecName: "istio-token") pod "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" (UID: "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa"). InnerVolumeSpecName "istio-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:22:09.658720 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.658640 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-podinfo" (OuterVolumeSpecName: "istio-podinfo") pod "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" (UID: "f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa"). InnerVolumeSpecName "istio-podinfo". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Apr 17 20:22:09.757554 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.757506 2579 reconciler_common.go:299] "Volume detached for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-envoy\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:09.757554 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.757551 2579 reconciler_common.go:299] "Volume detached for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-podinfo\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:09.757554 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.757561 2579 reconciler_common.go:299] "Volume detached for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istiod-ca-cert\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:09.757554 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.757570 2579 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-istio-token\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:09.757876 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.757578 2579 reconciler_common.go:299] "Volume detached for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-credential-socket\") on 
node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:09.757876 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:09.757587 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ds56w\" (UniqueName: \"kubernetes.io/projected/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa-kube-api-access-ds56w\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:10.336582 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:10.336549 2579 generic.go:358] "Generic (PLEG): container finished" podID="f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" containerID="081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499" exitCode=0 Apr 17 20:22:10.336823 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:10.336598 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm" event={"ID":"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa","Type":"ContainerDied","Data":"081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499"} Apr 17 20:22:10.336823 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:10.336619 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm" event={"ID":"f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa","Type":"ContainerDied","Data":"4cf9b7d5311a2626bee84236ad1a06010d386837e8a782daaa17af5a8e448516"} Apr 17 20:22:10.336823 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:10.336627 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm" Apr 17 20:22:10.336823 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:10.336640 2579 scope.go:117] "RemoveContainer" containerID="081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499" Apr 17 20:22:10.346304 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:10.346287 2579 scope.go:117] "RemoveContainer" containerID="081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499" Apr 17 20:22:10.346571 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:22:10.346552 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499\": container with ID starting with 081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499 not found: ID does not exist" containerID="081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499" Apr 17 20:22:10.346614 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:10.346581 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499"} err="failed to get container status \"081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499\": rpc error: code = NotFound desc = could not find container \"081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499\": container with ID starting with 081d25dec45284680c1c7b6b13232599b4a73f6a588f6ccb07e83dcd8ec52499 not found: ID does not exist" Apr 17 20:22:10.357800 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:10.357775 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"] Apr 17 20:22:10.363397 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:10.363371 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rxjlm"] Apr 17 20:22:10.881214 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:10.881177 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" path="/var/lib/kubelet/pods/f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa/volumes" Apr 17 20:22:34.252239 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.252192 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dqf78"] Apr 17 20:22:34.252818 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.252769 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" containerName="istio-proxy" Apr 17 20:22:34.252818 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.252792 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" containerName="istio-proxy" Apr 17 20:22:34.252963 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.252885 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1a9ee71-da70-4ec3-b8af-32f87e1ce1aa" containerName="istio-proxy" Apr 17 20:22:34.258246 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.258221 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dqf78" Apr 17 20:22:34.260879 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.260813 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 20:22:34.261774 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.261731 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-tnhzl\"" Apr 17 20:22:34.262083 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.261781 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 20:22:34.262458 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.262433 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dqf78"] Apr 17 20:22:34.378191 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.378157 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g72md\" (UniqueName: \"kubernetes.io/projected/387fb55d-ec1b-45df-a264-2084da6342e1-kube-api-access-g72md\") pod \"kuadrant-operator-catalog-dqf78\" (UID: \"387fb55d-ec1b-45df-a264-2084da6342e1\") " pod="kuadrant-system/kuadrant-operator-catalog-dqf78" Apr 17 20:22:34.478756 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.478718 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g72md\" (UniqueName: \"kubernetes.io/projected/387fb55d-ec1b-45df-a264-2084da6342e1-kube-api-access-g72md\") pod \"kuadrant-operator-catalog-dqf78\" (UID: \"387fb55d-ec1b-45df-a264-2084da6342e1\") " pod="kuadrant-system/kuadrant-operator-catalog-dqf78" Apr 17 20:22:34.487984 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.487947 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g72md\" (UniqueName: 
\"kubernetes.io/projected/387fb55d-ec1b-45df-a264-2084da6342e1-kube-api-access-g72md\") pod \"kuadrant-operator-catalog-dqf78\" (UID: \"387fb55d-ec1b-45df-a264-2084da6342e1\") " pod="kuadrant-system/kuadrant-operator-catalog-dqf78" Apr 17 20:22:34.569126 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.569036 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dqf78" Apr 17 20:22:34.623463 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.623423 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dqf78"] Apr 17 20:22:34.692578 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.692552 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dqf78"] Apr 17 20:22:34.694562 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:22:34.694534 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod387fb55d_ec1b_45df_a264_2084da6342e1.slice/crio-e05394018ebc4ec6331bc5de81f72461200ff8a8a2271e93fb7e13dbe92c1ab6 WatchSource:0}: Error finding container e05394018ebc4ec6331bc5de81f72461200ff8a8a2271e93fb7e13dbe92c1ab6: Status 404 returned error can't find the container with id e05394018ebc4ec6331bc5de81f72461200ff8a8a2271e93fb7e13dbe92c1ab6 Apr 17 20:22:34.830408 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.830327 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-stdpt"] Apr 17 20:22:34.833209 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.833191 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-stdpt" Apr 17 20:22:34.839232 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.839206 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-stdpt"] Apr 17 20:22:34.983612 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:34.983576 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wl7q\" (UniqueName: \"kubernetes.io/projected/7764d55c-5e37-47c3-9e21-f52d5eb3e0b3-kube-api-access-6wl7q\") pod \"kuadrant-operator-catalog-stdpt\" (UID: \"7764d55c-5e37-47c3-9e21-f52d5eb3e0b3\") " pod="kuadrant-system/kuadrant-operator-catalog-stdpt" Apr 17 20:22:35.084700 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:35.084589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wl7q\" (UniqueName: \"kubernetes.io/projected/7764d55c-5e37-47c3-9e21-f52d5eb3e0b3-kube-api-access-6wl7q\") pod \"kuadrant-operator-catalog-stdpt\" (UID: \"7764d55c-5e37-47c3-9e21-f52d5eb3e0b3\") " pod="kuadrant-system/kuadrant-operator-catalog-stdpt" Apr 17 20:22:35.092956 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:35.092921 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wl7q\" (UniqueName: \"kubernetes.io/projected/7764d55c-5e37-47c3-9e21-f52d5eb3e0b3-kube-api-access-6wl7q\") pod \"kuadrant-operator-catalog-stdpt\" (UID: \"7764d55c-5e37-47c3-9e21-f52d5eb3e0b3\") " pod="kuadrant-system/kuadrant-operator-catalog-stdpt" Apr 17 20:22:35.143896 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:35.143856 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-stdpt" Apr 17 20:22:35.283204 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:35.283166 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-stdpt"] Apr 17 20:22:35.311247 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:22:35.311207 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7764d55c_5e37_47c3_9e21_f52d5eb3e0b3.slice/crio-487da6c743e074af406f1dbbdb14cc07b82b38e12d255f2066cbc6724733960f WatchSource:0}: Error finding container 487da6c743e074af406f1dbbdb14cc07b82b38e12d255f2066cbc6724733960f: Status 404 returned error can't find the container with id 487da6c743e074af406f1dbbdb14cc07b82b38e12d255f2066cbc6724733960f Apr 17 20:22:35.428082 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:35.428029 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dqf78" event={"ID":"387fb55d-ec1b-45df-a264-2084da6342e1","Type":"ContainerStarted","Data":"e05394018ebc4ec6331bc5de81f72461200ff8a8a2271e93fb7e13dbe92c1ab6"} Apr 17 20:22:35.429198 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:35.429160 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-stdpt" event={"ID":"7764d55c-5e37-47c3-9e21-f52d5eb3e0b3","Type":"ContainerStarted","Data":"487da6c743e074af406f1dbbdb14cc07b82b38e12d255f2066cbc6724733960f"} Apr 17 20:22:37.439121 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:37.439079 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-stdpt" event={"ID":"7764d55c-5e37-47c3-9e21-f52d5eb3e0b3","Type":"ContainerStarted","Data":"4b96072afa6898244ebb32f8bc01e0cb020051d2a60d4fbb5c7ceb0a9e09d2ac"} Apr 17 20:22:37.440444 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:37.440413 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-catalog-dqf78" event={"ID":"387fb55d-ec1b-45df-a264-2084da6342e1","Type":"ContainerStarted","Data":"84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d"} Apr 17 20:22:37.440585 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:37.440482 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-dqf78" podUID="387fb55d-ec1b-45df-a264-2084da6342e1" containerName="registry-server" containerID="cri-o://84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d" gracePeriod=2 Apr 17 20:22:37.453311 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:37.453255 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-stdpt" podStartSLOduration=2.065853079 podStartE2EDuration="3.453236794s" podCreationTimestamp="2026-04-17 20:22:34 +0000 UTC" firstStartedPulling="2026-04-17 20:22:35.312770212 +0000 UTC m=+424.977695890" lastFinishedPulling="2026-04-17 20:22:36.700153937 +0000 UTC m=+426.365079605" observedRunningTime="2026-04-17 20:22:37.452305091 +0000 UTC m=+427.117230773" watchObservedRunningTime="2026-04-17 20:22:37.453236794 +0000 UTC m=+427.118162485" Apr 17 20:22:37.466451 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:37.466403 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-dqf78" podStartSLOduration=1.465285173 podStartE2EDuration="3.466389662s" podCreationTimestamp="2026-04-17 20:22:34 +0000 UTC" firstStartedPulling="2026-04-17 20:22:34.695866193 +0000 UTC m=+424.360791860" lastFinishedPulling="2026-04-17 20:22:36.696970681 +0000 UTC m=+426.361896349" observedRunningTime="2026-04-17 20:22:37.4647838 +0000 UTC m=+427.129709484" watchObservedRunningTime="2026-04-17 20:22:37.466389662 +0000 UTC m=+427.131315352" Apr 17 20:22:37.674668 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:37.674641 2579 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dqf78" Apr 17 20:22:37.708196 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:37.708118 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g72md\" (UniqueName: \"kubernetes.io/projected/387fb55d-ec1b-45df-a264-2084da6342e1-kube-api-access-g72md\") pod \"387fb55d-ec1b-45df-a264-2084da6342e1\" (UID: \"387fb55d-ec1b-45df-a264-2084da6342e1\") " Apr 17 20:22:37.710549 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:37.710517 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/387fb55d-ec1b-45df-a264-2084da6342e1-kube-api-access-g72md" (OuterVolumeSpecName: "kube-api-access-g72md") pod "387fb55d-ec1b-45df-a264-2084da6342e1" (UID: "387fb55d-ec1b-45df-a264-2084da6342e1"). InnerVolumeSpecName "kube-api-access-g72md". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:22:37.809314 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:37.809271 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g72md\" (UniqueName: \"kubernetes.io/projected/387fb55d-ec1b-45df-a264-2084da6342e1-kube-api-access-g72md\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:38.445232 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:38.445191 2579 generic.go:358] "Generic (PLEG): container finished" podID="387fb55d-ec1b-45df-a264-2084da6342e1" containerID="84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d" exitCode=0 Apr 17 20:22:38.445651 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:38.445250 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dqf78" Apr 17 20:22:38.445651 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:38.445272 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dqf78" event={"ID":"387fb55d-ec1b-45df-a264-2084da6342e1","Type":"ContainerDied","Data":"84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d"} Apr 17 20:22:38.445651 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:38.445308 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dqf78" event={"ID":"387fb55d-ec1b-45df-a264-2084da6342e1","Type":"ContainerDied","Data":"e05394018ebc4ec6331bc5de81f72461200ff8a8a2271e93fb7e13dbe92c1ab6"} Apr 17 20:22:38.445651 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:38.445326 2579 scope.go:117] "RemoveContainer" containerID="84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d" Apr 17 20:22:38.454696 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:38.454677 2579 scope.go:117] "RemoveContainer" containerID="84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d" Apr 17 20:22:38.454967 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:22:38.454945 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d\": container with ID starting with 84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d not found: ID does not exist" containerID="84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d" Apr 17 20:22:38.455032 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:38.454981 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d"} err="failed to get container status \"84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d\": rpc error: code 
= NotFound desc = could not find container \"84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d\": container with ID starting with 84e8298dfb2e462e7d5b1c9b425d799580804267dae3c359330b342f197c987d not found: ID does not exist" Apr 17 20:22:38.469111 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:38.469058 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dqf78"] Apr 17 20:22:38.473221 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:38.473193 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dqf78"] Apr 17 20:22:38.878445 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:38.878414 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="387fb55d-ec1b-45df-a264-2084da6342e1" path="/var/lib/kubelet/pods/387fb55d-ec1b-45df-a264-2084da6342e1/volumes" Apr 17 20:22:45.144121 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:45.144067 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-stdpt" Apr 17 20:22:45.144121 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:45.144124 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-stdpt" Apr 17 20:22:45.166717 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:45.166691 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-stdpt" Apr 17 20:22:45.492582 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:45.492515 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-stdpt" Apr 17 20:22:49.858633 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:49.858545 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7"] Apr 17 20:22:49.859056 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:22:49.858948 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="387fb55d-ec1b-45df-a264-2084da6342e1" containerName="registry-server" Apr 17 20:22:49.859056 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:49.858963 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="387fb55d-ec1b-45df-a264-2084da6342e1" containerName="registry-server" Apr 17 20:22:49.859056 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:49.859036 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="387fb55d-ec1b-45df-a264-2084da6342e1" containerName="registry-server" Apr 17 20:22:49.861162 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:49.861146 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:49.863313 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:49.863291 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-dhzz5\"" Apr 17 20:22:49.876125 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:49.871682 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7"] Apr 17 20:22:49.912499 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:49.912454 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d748162-4119-4a69-95dd-02301e4f557d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7\" (UID: \"7d748162-4119-4a69-95dd-02301e4f557d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:49.912671 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:49.912574 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8xjg\" 
(UniqueName: \"kubernetes.io/projected/7d748162-4119-4a69-95dd-02301e4f557d-kube-api-access-p8xjg\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7\" (UID: \"7d748162-4119-4a69-95dd-02301e4f557d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:49.912671 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:49.912601 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d748162-4119-4a69-95dd-02301e4f557d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7\" (UID: \"7d748162-4119-4a69-95dd-02301e4f557d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:50.013148 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.013089 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d748162-4119-4a69-95dd-02301e4f557d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7\" (UID: \"7d748162-4119-4a69-95dd-02301e4f557d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:50.013333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.013240 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8xjg\" (UniqueName: \"kubernetes.io/projected/7d748162-4119-4a69-95dd-02301e4f557d-kube-api-access-p8xjg\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7\" (UID: \"7d748162-4119-4a69-95dd-02301e4f557d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:50.013333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.013285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7d748162-4119-4a69-95dd-02301e4f557d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7\" (UID: \"7d748162-4119-4a69-95dd-02301e4f557d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:50.013591 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.013568 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d748162-4119-4a69-95dd-02301e4f557d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7\" (UID: \"7d748162-4119-4a69-95dd-02301e4f557d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:50.013627 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.013577 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d748162-4119-4a69-95dd-02301e4f557d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7\" (UID: \"7d748162-4119-4a69-95dd-02301e4f557d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:50.021096 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.021072 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8xjg\" (UniqueName: \"kubernetes.io/projected/7d748162-4119-4a69-95dd-02301e4f557d-kube-api-access-p8xjg\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7\" (UID: \"7d748162-4119-4a69-95dd-02301e4f557d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:50.179166 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.179128 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:50.303307 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.303267 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7"] Apr 17 20:22:50.306792 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:22:50.306733 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d748162_4119_4a69_95dd_02301e4f557d.slice/crio-eb14f0824dd8de2582371f8da8057a432c244963b81d7fd4a79bfddd6905f4f6 WatchSource:0}: Error finding container eb14f0824dd8de2582371f8da8057a432c244963b81d7fd4a79bfddd6905f4f6: Status 404 returned error can't find the container with id eb14f0824dd8de2582371f8da8057a432c244963b81d7fd4a79bfddd6905f4f6 Apr 17 20:22:50.481435 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.481354 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk"] Apr 17 20:22:50.483964 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.483946 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:50.491688 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.491659 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk"] Apr 17 20:22:50.493307 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.493238 2579 generic.go:358] "Generic (PLEG): container finished" podID="7d748162-4119-4a69-95dd-02301e4f557d" containerID="e4adfb585a6825345b125d362f37f096e4e4796a69ab661e3c5ddc88f0f24b61" exitCode=0 Apr 17 20:22:50.493432 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.493317 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" event={"ID":"7d748162-4119-4a69-95dd-02301e4f557d","Type":"ContainerDied","Data":"e4adfb585a6825345b125d362f37f096e4e4796a69ab661e3c5ddc88f0f24b61"} Apr 17 20:22:50.493432 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.493350 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" event={"ID":"7d748162-4119-4a69-95dd-02301e4f557d","Type":"ContainerStarted","Data":"eb14f0824dd8de2582371f8da8057a432c244963b81d7fd4a79bfddd6905f4f6"} Apr 17 20:22:50.518119 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.518081 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk\" (UID: \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:50.518297 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.518135 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hmq9t\" (UniqueName: \"kubernetes.io/projected/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-kube-api-access-hmq9t\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk\" (UID: \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:50.518297 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.518196 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk\" (UID: \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:50.619381 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.619342 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk\" (UID: \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:50.619578 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.619410 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk\" (UID: \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:50.619578 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.619442 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmq9t\" (UniqueName: 
\"kubernetes.io/projected/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-kube-api-access-hmq9t\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk\" (UID: \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:50.619862 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.619839 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk\" (UID: \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:50.619862 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.619855 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk\" (UID: \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:50.626971 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.626947 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmq9t\" (UniqueName: \"kubernetes.io/projected/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-kube-api-access-hmq9t\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk\" (UID: \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:50.795732 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.795629 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:50.925440 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:50.925418 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk"] Apr 17 20:22:50.927033 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:22:50.927004 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d7529f4_cf46_4ef3_a0fb_2c7c129ac06a.slice/crio-f9b67ddad7b3385bdf0c3296fb3da659c4c64fcaecd43b8a8f9c66b13c3ee7e4 WatchSource:0}: Error finding container f9b67ddad7b3385bdf0c3296fb3da659c4c64fcaecd43b8a8f9c66b13c3ee7e4: Status 404 returned error can't find the container with id f9b67ddad7b3385bdf0c3296fb3da659c4c64fcaecd43b8a8f9c66b13c3ee7e4 Apr 17 20:22:51.061915 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.061842 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r"] Apr 17 20:22:51.064229 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.064212 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:22:51.071762 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.071721 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r"] Apr 17 20:22:51.124797 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.124737 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r\" (UID: \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:22:51.124977 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.124823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99vm7\" (UniqueName: \"kubernetes.io/projected/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-kube-api-access-99vm7\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r\" (UID: \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:22:51.124977 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.124863 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r\" (UID: \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:22:51.225418 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.225390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r\" (UID: \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:22:51.225522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.225438 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99vm7\" (UniqueName: \"kubernetes.io/projected/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-kube-api-access-99vm7\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r\" (UID: \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:22:51.225587 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.225565 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r\" (UID: \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:22:51.225806 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.225786 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r\" (UID: \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:22:51.225969 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.225907 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-util\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r\" (UID: \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:22:51.232904 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.232878 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99vm7\" (UniqueName: \"kubernetes.io/projected/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-kube-api-access-99vm7\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r\" (UID: \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:22:51.404188 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.404156 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:22:51.463255 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.463224 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww"] Apr 17 20:22:51.466421 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.466400 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:51.473814 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.473786 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww"] Apr 17 20:22:51.498802 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.498769 2579 generic.go:358] "Generic (PLEG): container finished" podID="9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" containerID="4e7d38ab11dde11477665141aafa8719e379c1adcd174b4cd99c6a06aefa876e" exitCode=0 Apr 17 20:22:51.498982 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.498883 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" event={"ID":"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a","Type":"ContainerDied","Data":"4e7d38ab11dde11477665141aafa8719e379c1adcd174b4cd99c6a06aefa876e"} Apr 17 20:22:51.498982 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.498911 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" event={"ID":"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a","Type":"ContainerStarted","Data":"f9b67ddad7b3385bdf0c3296fb3da659c4c64fcaecd43b8a8f9c66b13c3ee7e4"} Apr 17 20:22:51.500724 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.500700 2579 generic.go:358] "Generic (PLEG): container finished" podID="7d748162-4119-4a69-95dd-02301e4f557d" containerID="6a435e1c951953e8695ff635fbd36c2d49db76bfcd5d89010dbbebe1745f6408" exitCode=0 Apr 17 20:22:51.500876 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.500734 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" event={"ID":"7d748162-4119-4a69-95dd-02301e4f557d","Type":"ContainerDied","Data":"6a435e1c951953e8695ff635fbd36c2d49db76bfcd5d89010dbbebe1745f6408"} Apr 17 
20:22:51.528212 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.528171 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tzwl\" (UniqueName: \"kubernetes.io/projected/76972add-7656-4d60-85e0-b01a86f15425-kube-api-access-9tzwl\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww\" (UID: \"76972add-7656-4d60-85e0-b01a86f15425\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:51.528872 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.528840 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76972add-7656-4d60-85e0-b01a86f15425-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww\" (UID: \"76972add-7656-4d60-85e0-b01a86f15425\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:51.528991 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.528952 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76972add-7656-4d60-85e0-b01a86f15425-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww\" (UID: \"76972add-7656-4d60-85e0-b01a86f15425\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:51.538198 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.538171 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r"] Apr 17 20:22:51.548562 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:22:51.548535 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb665cb0_45bb_47ff_91cc_a45213b8e5b3.slice/crio-bbdf79088bcf13e4c3ecb2b3de375700b1260549230990c02a0c873247ab5142 WatchSource:0}: Error finding container bbdf79088bcf13e4c3ecb2b3de375700b1260549230990c02a0c873247ab5142: Status 404 returned error can't find the container with id bbdf79088bcf13e4c3ecb2b3de375700b1260549230990c02a0c873247ab5142 Apr 17 20:22:51.629692 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.629658 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76972add-7656-4d60-85e0-b01a86f15425-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww\" (UID: \"76972add-7656-4d60-85e0-b01a86f15425\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:51.629877 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.629718 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76972add-7656-4d60-85e0-b01a86f15425-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww\" (UID: \"76972add-7656-4d60-85e0-b01a86f15425\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:51.629877 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.629804 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tzwl\" (UniqueName: \"kubernetes.io/projected/76972add-7656-4d60-85e0-b01a86f15425-kube-api-access-9tzwl\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww\" (UID: \"76972add-7656-4d60-85e0-b01a86f15425\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:51.630240 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.630222 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/76972add-7656-4d60-85e0-b01a86f15425-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww\" (UID: \"76972add-7656-4d60-85e0-b01a86f15425\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:51.630274 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.630219 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76972add-7656-4d60-85e0-b01a86f15425-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww\" (UID: \"76972add-7656-4d60-85e0-b01a86f15425\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:51.639264 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.639239 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tzwl\" (UniqueName: \"kubernetes.io/projected/76972add-7656-4d60-85e0-b01a86f15425-kube-api-access-9tzwl\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww\" (UID: \"76972add-7656-4d60-85e0-b01a86f15425\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:51.777922 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.777883 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:51.904823 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:51.904793 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww"] Apr 17 20:22:51.907406 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:22:51.907369 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76972add_7656_4d60_85e0_b01a86f15425.slice/crio-1bf9edf6fb7b4382550aa97dc8f0e67468a1cd58591353787e82ac032e70baeb WatchSource:0}: Error finding container 1bf9edf6fb7b4382550aa97dc8f0e67468a1cd58591353787e82ac032e70baeb: Status 404 returned error can't find the container with id 1bf9edf6fb7b4382550aa97dc8f0e67468a1cd58591353787e82ac032e70baeb Apr 17 20:22:52.506500 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:52.506418 2579 generic.go:358] "Generic (PLEG): container finished" podID="9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" containerID="c10370cec5b3e723fd9a2dd72e78ced9f8fa7a63fde231b593cda9b68a3ab823" exitCode=0 Apr 17 20:22:52.506961 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:52.506510 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" event={"ID":"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a","Type":"ContainerDied","Data":"c10370cec5b3e723fd9a2dd72e78ced9f8fa7a63fde231b593cda9b68a3ab823"} Apr 17 20:22:52.507877 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:52.507853 2579 generic.go:358] "Generic (PLEG): container finished" podID="76972add-7656-4d60-85e0-b01a86f15425" containerID="adfe5bc81835b6f243ddf42e73ed66ab130c40d8d5f79d98ef8410b83af8cfe4" exitCode=0 Apr 17 20:22:52.507994 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:52.507936 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" event={"ID":"76972add-7656-4d60-85e0-b01a86f15425","Type":"ContainerDied","Data":"adfe5bc81835b6f243ddf42e73ed66ab130c40d8d5f79d98ef8410b83af8cfe4"} Apr 17 20:22:52.507994 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:52.507969 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" event={"ID":"76972add-7656-4d60-85e0-b01a86f15425","Type":"ContainerStarted","Data":"1bf9edf6fb7b4382550aa97dc8f0e67468a1cd58591353787e82ac032e70baeb"} Apr 17 20:22:52.510019 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:52.509992 2579 generic.go:358] "Generic (PLEG): container finished" podID="7d748162-4119-4a69-95dd-02301e4f557d" containerID="50f96723d50d288d24ffdf7e84285c6c8f5af731a8600f333df98a3f0bf04357" exitCode=0 Apr 17 20:22:52.510112 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:52.510076 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" event={"ID":"7d748162-4119-4a69-95dd-02301e4f557d","Type":"ContainerDied","Data":"50f96723d50d288d24ffdf7e84285c6c8f5af731a8600f333df98a3f0bf04357"} Apr 17 20:22:52.511425 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:52.511404 2579 generic.go:358] "Generic (PLEG): container finished" podID="fb665cb0-45bb-47ff-91cc-a45213b8e5b3" containerID="c2d71ab58c40246754faceb8f08213da33a77d5da9c75f2c05509cf604bbfe2d" exitCode=0 Apr 17 20:22:52.511502 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:52.511446 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" event={"ID":"fb665cb0-45bb-47ff-91cc-a45213b8e5b3","Type":"ContainerDied","Data":"c2d71ab58c40246754faceb8f08213da33a77d5da9c75f2c05509cf604bbfe2d"} Apr 17 20:22:52.511502 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:52.511463 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" event={"ID":"fb665cb0-45bb-47ff-91cc-a45213b8e5b3","Type":"ContainerStarted","Data":"bbdf79088bcf13e4c3ecb2b3de375700b1260549230990c02a0c873247ab5142"} Apr 17 20:22:53.517319 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.517284 2579 generic.go:358] "Generic (PLEG): container finished" podID="9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" containerID="351a443787371d32996d384f3de1d03b4945dac3418b22cc11be1a472c8e5258" exitCode=0 Apr 17 20:22:53.517797 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.517340 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" event={"ID":"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a","Type":"ContainerDied","Data":"351a443787371d32996d384f3de1d03b4945dac3418b22cc11be1a472c8e5258"} Apr 17 20:22:53.518912 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.518887 2579 generic.go:358] "Generic (PLEG): container finished" podID="76972add-7656-4d60-85e0-b01a86f15425" containerID="7e4022a14520ed1fa433f8926c604484c88cebc6d4bbe3524587b042457b5182" exitCode=0 Apr 17 20:22:53.519015 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.518939 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" event={"ID":"76972add-7656-4d60-85e0-b01a86f15425","Type":"ContainerDied","Data":"7e4022a14520ed1fa433f8926c604484c88cebc6d4bbe3524587b042457b5182"} Apr 17 20:22:53.520692 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.520619 2579 generic.go:358] "Generic (PLEG): container finished" podID="fb665cb0-45bb-47ff-91cc-a45213b8e5b3" containerID="670cd2d9413ec2b3bedce9e933fae5c17bc6d02cdd2b2f980933ef6349cea45a" exitCode=0 Apr 17 20:22:53.520776 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.520702 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" event={"ID":"fb665cb0-45bb-47ff-91cc-a45213b8e5b3","Type":"ContainerDied","Data":"670cd2d9413ec2b3bedce9e933fae5c17bc6d02cdd2b2f980933ef6349cea45a"} Apr 17 20:22:53.644988 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.644967 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:53.748304 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.748271 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8xjg\" (UniqueName: \"kubernetes.io/projected/7d748162-4119-4a69-95dd-02301e4f557d-kube-api-access-p8xjg\") pod \"7d748162-4119-4a69-95dd-02301e4f557d\" (UID: \"7d748162-4119-4a69-95dd-02301e4f557d\") " Apr 17 20:22:53.748445 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.748325 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d748162-4119-4a69-95dd-02301e4f557d-bundle\") pod \"7d748162-4119-4a69-95dd-02301e4f557d\" (UID: \"7d748162-4119-4a69-95dd-02301e4f557d\") " Apr 17 20:22:53.748445 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.748373 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d748162-4119-4a69-95dd-02301e4f557d-util\") pod \"7d748162-4119-4a69-95dd-02301e4f557d\" (UID: \"7d748162-4119-4a69-95dd-02301e4f557d\") " Apr 17 20:22:53.748970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.748944 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d748162-4119-4a69-95dd-02301e4f557d-bundle" (OuterVolumeSpecName: "bundle") pod "7d748162-4119-4a69-95dd-02301e4f557d" (UID: "7d748162-4119-4a69-95dd-02301e4f557d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:53.750453 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.750431 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d748162-4119-4a69-95dd-02301e4f557d-kube-api-access-p8xjg" (OuterVolumeSpecName: "kube-api-access-p8xjg") pod "7d748162-4119-4a69-95dd-02301e4f557d" (UID: "7d748162-4119-4a69-95dd-02301e4f557d"). InnerVolumeSpecName "kube-api-access-p8xjg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:22:53.753886 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.753867 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d748162-4119-4a69-95dd-02301e4f557d-util" (OuterVolumeSpecName: "util") pod "7d748162-4119-4a69-95dd-02301e4f557d" (UID: "7d748162-4119-4a69-95dd-02301e4f557d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:53.849643 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.849591 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p8xjg\" (UniqueName: \"kubernetes.io/projected/7d748162-4119-4a69-95dd-02301e4f557d-kube-api-access-p8xjg\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:53.849643 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.849639 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d748162-4119-4a69-95dd-02301e4f557d-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:53.849643 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:53.849649 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d748162-4119-4a69-95dd-02301e4f557d-util\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:54.527407 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.527372 2579 generic.go:358] "Generic (PLEG): container 
finished" podID="76972add-7656-4d60-85e0-b01a86f15425" containerID="7788a8bdb1d5bc1e3a182f3948382b8bca26c6873454ce76c240ac458a14a0c0" exitCode=0 Apr 17 20:22:54.527872 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.527452 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" event={"ID":"76972add-7656-4d60-85e0-b01a86f15425","Type":"ContainerDied","Data":"7788a8bdb1d5bc1e3a182f3948382b8bca26c6873454ce76c240ac458a14a0c0"} Apr 17 20:22:54.529201 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.529173 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" event={"ID":"7d748162-4119-4a69-95dd-02301e4f557d","Type":"ContainerDied","Data":"eb14f0824dd8de2582371f8da8057a432c244963b81d7fd4a79bfddd6905f4f6"} Apr 17 20:22:54.529201 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.529197 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7" Apr 17 20:22:54.529396 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.529205 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb14f0824dd8de2582371f8da8057a432c244963b81d7fd4a79bfddd6905f4f6" Apr 17 20:22:54.530999 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.530974 2579 generic.go:358] "Generic (PLEG): container finished" podID="fb665cb0-45bb-47ff-91cc-a45213b8e5b3" containerID="ddb98709d6fc6fca7f0fff9e059576996a847218d40f916807657e33e5eeb0b8" exitCode=0 Apr 17 20:22:54.531127 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.531029 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" event={"ID":"fb665cb0-45bb-47ff-91cc-a45213b8e5b3","Type":"ContainerDied","Data":"ddb98709d6fc6fca7f0fff9e059576996a847218d40f916807657e33e5eeb0b8"} Apr 17 20:22:54.658530 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.658506 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:54.758204 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.758165 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmq9t\" (UniqueName: \"kubernetes.io/projected/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-kube-api-access-hmq9t\") pod \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\" (UID: \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\") " Apr 17 20:22:54.758370 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.758215 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-util\") pod \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\" (UID: \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\") " Apr 17 20:22:54.758370 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.758327 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-bundle\") pod \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\" (UID: \"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a\") " Apr 17 20:22:54.758921 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.758822 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-bundle" (OuterVolumeSpecName: "bundle") pod "9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" (UID: "9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:54.760446 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.760425 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-kube-api-access-hmq9t" (OuterVolumeSpecName: "kube-api-access-hmq9t") pod "9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" (UID: "9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a"). InnerVolumeSpecName "kube-api-access-hmq9t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:22:54.765275 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.765248 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-util" (OuterVolumeSpecName: "util") pod "9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" (UID: "9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:54.859529 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.859434 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:54.859529 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.859467 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hmq9t\" (UniqueName: \"kubernetes.io/projected/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-kube-api-access-hmq9t\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:54.859529 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:54.859478 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a-util\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:55.537322 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.537291 2579 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" Apr 17 20:22:55.537678 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.537284 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk" event={"ID":"9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a","Type":"ContainerDied","Data":"f9b67ddad7b3385bdf0c3296fb3da659c4c64fcaecd43b8a8f9c66b13c3ee7e4"} Apr 17 20:22:55.537678 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.537411 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9b67ddad7b3385bdf0c3296fb3da659c4c64fcaecd43b8a8f9c66b13c3ee7e4" Apr 17 20:22:55.672553 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.672528 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:22:55.703354 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.703330 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:55.770435 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.770393 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99vm7\" (UniqueName: \"kubernetes.io/projected/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-kube-api-access-99vm7\") pod \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\" (UID: \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\") " Apr 17 20:22:55.770435 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.770435 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76972add-7656-4d60-85e0-b01a86f15425-bundle\") pod \"76972add-7656-4d60-85e0-b01a86f15425\" (UID: \"76972add-7656-4d60-85e0-b01a86f15425\") " Apr 17 20:22:55.770643 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.770465 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tzwl\" (UniqueName: \"kubernetes.io/projected/76972add-7656-4d60-85e0-b01a86f15425-kube-api-access-9tzwl\") pod \"76972add-7656-4d60-85e0-b01a86f15425\" (UID: \"76972add-7656-4d60-85e0-b01a86f15425\") " Apr 17 20:22:55.770678 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.770657 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-bundle\") pod \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\" (UID: \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\") " Apr 17 20:22:55.770726 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.770711 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-util\") pod \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\" (UID: \"fb665cb0-45bb-47ff-91cc-a45213b8e5b3\") " Apr 17 20:22:55.770795 ip-10-0-139-2 kubenswrapper[2579]: 
I0417 20:22:55.770739 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76972add-7656-4d60-85e0-b01a86f15425-util\") pod \"76972add-7656-4d60-85e0-b01a86f15425\" (UID: \"76972add-7656-4d60-85e0-b01a86f15425\") " Apr 17 20:22:55.771052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.771020 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76972add-7656-4d60-85e0-b01a86f15425-bundle" (OuterVolumeSpecName: "bundle") pod "76972add-7656-4d60-85e0-b01a86f15425" (UID: "76972add-7656-4d60-85e0-b01a86f15425"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:55.771492 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.771462 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-bundle" (OuterVolumeSpecName: "bundle") pod "fb665cb0-45bb-47ff-91cc-a45213b8e5b3" (UID: "fb665cb0-45bb-47ff-91cc-a45213b8e5b3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:55.772991 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.772954 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-kube-api-access-99vm7" (OuterVolumeSpecName: "kube-api-access-99vm7") pod "fb665cb0-45bb-47ff-91cc-a45213b8e5b3" (UID: "fb665cb0-45bb-47ff-91cc-a45213b8e5b3"). InnerVolumeSpecName "kube-api-access-99vm7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:22:55.773086 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.772996 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76972add-7656-4d60-85e0-b01a86f15425-kube-api-access-9tzwl" (OuterVolumeSpecName: "kube-api-access-9tzwl") pod "76972add-7656-4d60-85e0-b01a86f15425" (UID: "76972add-7656-4d60-85e0-b01a86f15425"). InnerVolumeSpecName "kube-api-access-9tzwl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:22:55.776389 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.776363 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76972add-7656-4d60-85e0-b01a86f15425-util" (OuterVolumeSpecName: "util") pod "76972add-7656-4d60-85e0-b01a86f15425" (UID: "76972add-7656-4d60-85e0-b01a86f15425"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:55.776486 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.776396 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-util" (OuterVolumeSpecName: "util") pod "fb665cb0-45bb-47ff-91cc-a45213b8e5b3" (UID: "fb665cb0-45bb-47ff-91cc-a45213b8e5b3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:22:55.872281 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.872177 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:55.872281 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.872222 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-util\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:55.872281 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.872232 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76972add-7656-4d60-85e0-b01a86f15425-util\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:55.872281 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.872240 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-99vm7\" (UniqueName: \"kubernetes.io/projected/fb665cb0-45bb-47ff-91cc-a45213b8e5b3-kube-api-access-99vm7\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:55.872281 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.872250 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76972add-7656-4d60-85e0-b01a86f15425-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:55.872281 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:55.872259 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9tzwl\" (UniqueName: \"kubernetes.io/projected/76972add-7656-4d60-85e0-b01a86f15425-kube-api-access-9tzwl\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:22:56.543580 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:56.543551 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" Apr 17 20:22:56.544083 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:56.543552 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww" event={"ID":"76972add-7656-4d60-85e0-b01a86f15425","Type":"ContainerDied","Data":"1bf9edf6fb7b4382550aa97dc8f0e67468a1cd58591353787e82ac032e70baeb"} Apr 17 20:22:56.544083 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:56.543654 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bf9edf6fb7b4382550aa97dc8f0e67468a1cd58591353787e82ac032e70baeb" Apr 17 20:22:56.545254 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:56.545228 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" event={"ID":"fb665cb0-45bb-47ff-91cc-a45213b8e5b3","Type":"ContainerDied","Data":"bbdf79088bcf13e4c3ecb2b3de375700b1260549230990c02a0c873247ab5142"} Apr 17 20:22:56.545254 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:56.545253 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbdf79088bcf13e4c3ecb2b3de375700b1260549230990c02a0c873247ab5142" Apr 17 20:22:56.545405 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:22:56.545276 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r" Apr 17 20:23:04.103567 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.103526 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64b7744d46-4q2t4"] Apr 17 20:23:04.104011 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.103987 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb665cb0-45bb-47ff-91cc-a45213b8e5b3" containerName="extract" Apr 17 20:23:04.104011 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104011 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb665cb0-45bb-47ff-91cc-a45213b8e5b3" containerName="extract" Apr 17 20:23:04.104108 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104024 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d748162-4119-4a69-95dd-02301e4f557d" containerName="util" Apr 17 20:23:04.104108 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104029 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d748162-4119-4a69-95dd-02301e4f557d" containerName="util" Apr 17 20:23:04.104108 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104038 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb665cb0-45bb-47ff-91cc-a45213b8e5b3" containerName="util" Apr 17 20:23:04.104108 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104046 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb665cb0-45bb-47ff-91cc-a45213b8e5b3" containerName="util" Apr 17 20:23:04.104108 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104055 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d748162-4119-4a69-95dd-02301e4f557d" containerName="pull" Apr 17 20:23:04.104108 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104063 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d748162-4119-4a69-95dd-02301e4f557d" containerName="pull" Apr 17 
20:23:04.104108 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104098 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" containerName="pull"
Apr 17 20:23:04.104108 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104104 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" containerName="pull"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104113 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76972add-7656-4d60-85e0-b01a86f15425" containerName="extract"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104118 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="76972add-7656-4d60-85e0-b01a86f15425" containerName="extract"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104131 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76972add-7656-4d60-85e0-b01a86f15425" containerName="pull"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104136 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="76972add-7656-4d60-85e0-b01a86f15425" containerName="pull"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104143 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d748162-4119-4a69-95dd-02301e4f557d" containerName="extract"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104149 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d748162-4119-4a69-95dd-02301e4f557d" containerName="extract"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104169 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76972add-7656-4d60-85e0-b01a86f15425" containerName="util"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104175 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="76972add-7656-4d60-85e0-b01a86f15425" containerName="util"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104182 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" containerName="extract"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104187 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" containerName="extract"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104194 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" containerName="util"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104199 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" containerName="util"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104204 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb665cb0-45bb-47ff-91cc-a45213b8e5b3" containerName="pull"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104209 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb665cb0-45bb-47ff-91cc-a45213b8e5b3" containerName="pull"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104289 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a" containerName="extract"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104299 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d748162-4119-4a69-95dd-02301e4f557d" containerName="extract"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104305 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="76972add-7656-4d60-85e0-b01a86f15425" containerName="extract"
Apr 17 20:23:04.104333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.104311 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb665cb0-45bb-47ff-91cc-a45213b8e5b3" containerName="extract"
Apr 17 20:23:04.109247 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.109222 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.115246 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.115217 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64b7744d46-4q2t4"]
Apr 17 20:23:04.244535 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.244498 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bea4f82-eaa0-42f1-8856-45a27b083b22-service-ca\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.244535 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.244538 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bea4f82-eaa0-42f1-8856-45a27b083b22-oauth-serving-cert\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.244789 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.244564 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bea4f82-eaa0-42f1-8856-45a27b083b22-console-config\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.244789 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.244642 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6klc\" (UniqueName: \"kubernetes.io/projected/8bea4f82-eaa0-42f1-8856-45a27b083b22-kube-api-access-f6klc\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.244789 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.244704 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bea4f82-eaa0-42f1-8856-45a27b083b22-trusted-ca-bundle\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.244789 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.244730 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bea4f82-eaa0-42f1-8856-45a27b083b22-console-serving-cert\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.244929 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.244848 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bea4f82-eaa0-42f1-8856-45a27b083b22-console-oauth-config\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.345468 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.345415 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6klc\" (UniqueName: \"kubernetes.io/projected/8bea4f82-eaa0-42f1-8856-45a27b083b22-kube-api-access-f6klc\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.345468 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.345472 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bea4f82-eaa0-42f1-8856-45a27b083b22-trusted-ca-bundle\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.345835 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.345543 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bea4f82-eaa0-42f1-8856-45a27b083b22-console-serving-cert\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.345835 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.345622 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bea4f82-eaa0-42f1-8856-45a27b083b22-console-oauth-config\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.345835 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.345656 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bea4f82-eaa0-42f1-8856-45a27b083b22-service-ca\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.345835 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.345687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bea4f82-eaa0-42f1-8856-45a27b083b22-oauth-serving-cert\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.345835 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.345724 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bea4f82-eaa0-42f1-8856-45a27b083b22-console-config\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.346453 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.346423 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bea4f82-eaa0-42f1-8856-45a27b083b22-oauth-serving-cert\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.346553 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.346479 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bea4f82-eaa0-42f1-8856-45a27b083b22-trusted-ca-bundle\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.346635 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.346612 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bea4f82-eaa0-42f1-8856-45a27b083b22-console-config\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.346917 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.346894 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bea4f82-eaa0-42f1-8856-45a27b083b22-service-ca\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.348352 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.348323 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bea4f82-eaa0-42f1-8856-45a27b083b22-console-serving-cert\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.348352 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.348343 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bea4f82-eaa0-42f1-8856-45a27b083b22-console-oauth-config\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.353191 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.353170 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6klc\" (UniqueName: \"kubernetes.io/projected/8bea4f82-eaa0-42f1-8856-45a27b083b22-kube-api-access-f6klc\") pod \"console-64b7744d46-4q2t4\" (UID: \"8bea4f82-eaa0-42f1-8856-45a27b083b22\") " pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.422773 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.422700 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:04.555312 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.555287 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64b7744d46-4q2t4"]
Apr 17 20:23:04.557259 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:23:04.557226 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bea4f82_eaa0_42f1_8856_45a27b083b22.slice/crio-428c0a4ecad4adbc0d0275e1dd1e48130c59ef36458402eb178179b14a9de4c4 WatchSource:0}: Error finding container 428c0a4ecad4adbc0d0275e1dd1e48130c59ef36458402eb178179b14a9de4c4: Status 404 returned error can't find the container with id 428c0a4ecad4adbc0d0275e1dd1e48130c59ef36458402eb178179b14a9de4c4
Apr 17 20:23:04.588216 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:04.588177 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b7744d46-4q2t4" event={"ID":"8bea4f82-eaa0-42f1-8856-45a27b083b22","Type":"ContainerStarted","Data":"428c0a4ecad4adbc0d0275e1dd1e48130c59ef36458402eb178179b14a9de4c4"}
Apr 17 20:23:05.593415 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:05.593377 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b7744d46-4q2t4" event={"ID":"8bea4f82-eaa0-42f1-8856-45a27b083b22","Type":"ContainerStarted","Data":"bae91151bc609f41d14139110129a56f3b570b91531da4779681f10c1742d16e"}
Apr 17 20:23:05.611939 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:05.611886 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64b7744d46-4q2t4" podStartSLOduration=1.611860587 podStartE2EDuration="1.611860587s" podCreationTimestamp="2026-04-17 20:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:23:05.60854103 +0000 UTC m=+455.273466910" watchObservedRunningTime="2026-04-17 20:23:05.611860587 +0000 UTC m=+455.276786276"
Apr 17 20:23:05.978138 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:05.978105 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn"]
Apr 17 20:23:05.981717 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:05.981695 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn"
Apr 17 20:23:05.983922 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:05.983904 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 17 20:23:05.983991 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:05.983950 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-f82ng\""
Apr 17 20:23:05.994423 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:05.994397 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn"]
Apr 17 20:23:06.060724 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:06.060680 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bktcg\" (UniqueName: \"kubernetes.io/projected/7a177b04-5a6e-419d-8e30-76109480eb46-kube-api-access-bktcg\") pod \"dns-operator-controller-manager-648d5c98bc-v5hqn\" (UID: \"7a177b04-5a6e-419d-8e30-76109480eb46\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn"
Apr 17 20:23:06.161775 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:06.161716 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bktcg\" (UniqueName: \"kubernetes.io/projected/7a177b04-5a6e-419d-8e30-76109480eb46-kube-api-access-bktcg\") pod \"dns-operator-controller-manager-648d5c98bc-v5hqn\" (UID: \"7a177b04-5a6e-419d-8e30-76109480eb46\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn"
Apr 17 20:23:06.173502 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:06.173466 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bktcg\" (UniqueName: \"kubernetes.io/projected/7a177b04-5a6e-419d-8e30-76109480eb46-kube-api-access-bktcg\") pod \"dns-operator-controller-manager-648d5c98bc-v5hqn\" (UID: \"7a177b04-5a6e-419d-8e30-76109480eb46\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn"
Apr 17 20:23:06.293293 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:06.293192 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn"
Apr 17 20:23:06.438875 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:06.438844 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn"]
Apr 17 20:23:06.442209 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:23:06.442179 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a177b04_5a6e_419d_8e30_76109480eb46.slice/crio-2d0a830b8ae280df839a798c0feb40f31140b53f8c3b72589d2f46c835a93732 WatchSource:0}: Error finding container 2d0a830b8ae280df839a798c0feb40f31140b53f8c3b72589d2f46c835a93732: Status 404 returned error can't find the container with id 2d0a830b8ae280df839a798c0feb40f31140b53f8c3b72589d2f46c835a93732
Apr 17 20:23:06.598260 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:06.598161 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn" event={"ID":"7a177b04-5a6e-419d-8e30-76109480eb46","Type":"ContainerStarted","Data":"2d0a830b8ae280df839a798c0feb40f31140b53f8c3b72589d2f46c835a93732"}
Apr 17 20:23:09.611324 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:09.611285 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn" event={"ID":"7a177b04-5a6e-419d-8e30-76109480eb46","Type":"ContainerStarted","Data":"b31e380e3db7ab66683526f7fc91fae28e8bfd268f4413c5fca430ef9562f569"}
Apr 17 20:23:09.611735 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:09.611410 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn"
Apr 17 20:23:09.630761 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:09.630696 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn" podStartSLOduration=1.9787059359999999 podStartE2EDuration="4.63068121s" podCreationTimestamp="2026-04-17 20:23:05 +0000 UTC" firstStartedPulling="2026-04-17 20:23:06.444216972 +0000 UTC m=+456.109142643" lastFinishedPulling="2026-04-17 20:23:09.096192055 +0000 UTC m=+458.761117917" observedRunningTime="2026-04-17 20:23:09.627051213 +0000 UTC m=+459.291976914" watchObservedRunningTime="2026-04-17 20:23:09.63068121 +0000 UTC m=+459.295606900"
Apr 17 20:23:11.404240 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:11.404200 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"]
Apr 17 20:23:11.412428 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:11.412398 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"
Apr 17 20:23:11.415942 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:11.415903 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-sl7bv\""
Apr 17 20:23:11.431672 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:11.431648 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"]
Apr 17 20:23:11.512647 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:11.512612 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8ea89f11-81bd-4658-9761-8284622dc37f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" (UID: \"8ea89f11-81bd-4658-9761-8284622dc37f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"
Apr 17 20:23:11.512647 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:11.512652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9bxv\" (UniqueName: \"kubernetes.io/projected/8ea89f11-81bd-4658-9761-8284622dc37f-kube-api-access-t9bxv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" (UID: \"8ea89f11-81bd-4658-9761-8284622dc37f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"
Apr 17 20:23:11.614098 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:11.614049 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8ea89f11-81bd-4658-9761-8284622dc37f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" (UID: \"8ea89f11-81bd-4658-9761-8284622dc37f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"
Apr 17 20:23:11.614285 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:11.614111 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9bxv\" (UniqueName: \"kubernetes.io/projected/8ea89f11-81bd-4658-9761-8284622dc37f-kube-api-access-t9bxv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" (UID: \"8ea89f11-81bd-4658-9761-8284622dc37f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"
Apr 17 20:23:11.614446 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:11.614425 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8ea89f11-81bd-4658-9761-8284622dc37f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" (UID: \"8ea89f11-81bd-4658-9761-8284622dc37f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"
Apr 17 20:23:11.622927 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:11.622902 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9bxv\" (UniqueName: \"kubernetes.io/projected/8ea89f11-81bd-4658-9761-8284622dc37f-kube-api-access-t9bxv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" (UID: \"8ea89f11-81bd-4658-9761-8284622dc37f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"
Apr 17 20:23:11.722198 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:11.722093 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"
Apr 17 20:23:11.867914 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:11.867888 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"]
Apr 17 20:23:11.870477 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:23:11.870451 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ea89f11_81bd_4658_9761_8284622dc37f.slice/crio-5fb8e8468dd5c9a4e427e2b348e289e54b6306b8476288474fc2f41b0de7bf28 WatchSource:0}: Error finding container 5fb8e8468dd5c9a4e427e2b348e289e54b6306b8476288474fc2f41b0de7bf28: Status 404 returned error can't find the container with id 5fb8e8468dd5c9a4e427e2b348e289e54b6306b8476288474fc2f41b0de7bf28
Apr 17 20:23:12.624828 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:12.624786 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" event={"ID":"8ea89f11-81bd-4658-9761-8284622dc37f","Type":"ContainerStarted","Data":"5fb8e8468dd5c9a4e427e2b348e289e54b6306b8476288474fc2f41b0de7bf28"}
Apr 17 20:23:14.423885 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:14.423833 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:14.423885 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:14.423881 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:14.428734 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:14.428705 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:14.639317 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:14.639288 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64b7744d46-4q2t4"
Apr 17 20:23:14.691729 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:14.691647 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bcb7f564c-6w4fw"]
Apr 17 20:23:16.646656 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:16.646560 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" event={"ID":"8ea89f11-81bd-4658-9761-8284622dc37f","Type":"ContainerStarted","Data":"cc6c4af4e8b4e57352288b2bda09359ea7675b029e2bc35b1e3cf10c7146c950"}
Apr 17 20:23:16.647097 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:16.646672 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"
Apr 17 20:23:16.666990 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:16.666939 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" podStartSLOduration=1.148719697 podStartE2EDuration="5.666923828s" podCreationTimestamp="2026-04-17 20:23:11 +0000 UTC" firstStartedPulling="2026-04-17 20:23:11.872995011 +0000 UTC m=+461.537920682" lastFinishedPulling="2026-04-17 20:23:16.391199135 +0000 UTC m=+466.056124813" observedRunningTime="2026-04-17 20:23:16.664792395 +0000 UTC m=+466.329718085" watchObservedRunningTime="2026-04-17 20:23:16.666923828 +0000 UTC m=+466.331849517"
Apr 17 20:23:17.565488 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:17.565452 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z"]
Apr 17 20:23:17.569148 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:17.569127 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z"
Apr 17 20:23:17.572457 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:17.572436 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-fkqvj\""
Apr 17 20:23:17.581362 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:17.581335 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z"]
Apr 17 20:23:17.672365 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:17.672329 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgstw\" (UniqueName: \"kubernetes.io/projected/4cc0057e-8422-40fc-8e04-6d92533a76af-kube-api-access-bgstw\") pod \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" (UID: \"4cc0057e-8422-40fc-8e04-6d92533a76af\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z"
Apr 17 20:23:17.773408 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:17.773365 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgstw\" (UniqueName: \"kubernetes.io/projected/4cc0057e-8422-40fc-8e04-6d92533a76af-kube-api-access-bgstw\") pod \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" (UID: \"4cc0057e-8422-40fc-8e04-6d92533a76af\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z"
Apr 17 20:23:17.781570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:17.781537 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgstw\" (UniqueName: \"kubernetes.io/projected/4cc0057e-8422-40fc-8e04-6d92533a76af-kube-api-access-bgstw\") pod \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" (UID: \"4cc0057e-8422-40fc-8e04-6d92533a76af\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z"
Apr 17 20:23:17.881116 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:17.881072 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z"
Apr 17 20:23:18.015995 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:18.015962 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z"]
Apr 17 20:23:18.017535 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:23:18.017505 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cc0057e_8422_40fc_8e04_6d92533a76af.slice/crio-07bcdcce003f8bc30c23fad95c7f0c0187bfdb6b14916d35b96360deb92955ba WatchSource:0}: Error finding container 07bcdcce003f8bc30c23fad95c7f0c0187bfdb6b14916d35b96360deb92955ba: Status 404 returned error can't find the container with id 07bcdcce003f8bc30c23fad95c7f0c0187bfdb6b14916d35b96360deb92955ba
Apr 17 20:23:18.656421 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:18.656379 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" event={"ID":"4cc0057e-8422-40fc-8e04-6d92533a76af","Type":"ContainerStarted","Data":"07bcdcce003f8bc30c23fad95c7f0c0187bfdb6b14916d35b96360deb92955ba"}
Apr 17 20:23:20.618090 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:20.618053 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-v5hqn"
Apr 17 20:23:20.666220 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:20.666188 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" event={"ID":"4cc0057e-8422-40fc-8e04-6d92533a76af","Type":"ContainerStarted","Data":"ba2c61f8eb3ee51e77ea6c49380d958b3806fe082b39c25058ac25c085ea6767"}
Apr 17 20:23:20.666415 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:20.666290 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z"
Apr 17 20:23:20.683381 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:20.683315 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" podStartSLOduration=1.944053749 podStartE2EDuration="3.683299523s" podCreationTimestamp="2026-04-17 20:23:17 +0000 UTC" firstStartedPulling="2026-04-17 20:23:18.019582648 +0000 UTC m=+467.684508322" lastFinishedPulling="2026-04-17 20:23:19.758828425 +0000 UTC m=+469.423754096" observedRunningTime="2026-04-17 20:23:20.68012759 +0000 UTC m=+470.345053340" watchObservedRunningTime="2026-04-17 20:23:20.683299523 +0000 UTC m=+470.348225212"
Apr 17 20:23:27.652465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:27.652436 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"
Apr 17 20:23:29.204946 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.204908 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"]
Apr 17 20:23:29.205364 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.205153 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" containerName="manager" containerID="cri-o://cc6c4af4e8b4e57352288b2bda09359ea7675b029e2bc35b1e3cf10c7146c950" gracePeriod=2
Apr 17 20:23:29.216852 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.216808 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6"]
Apr 17 20:23:29.237074 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.237045 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb"]
Apr 17 20:23:29.237796 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.237769 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" containerName="manager"
Apr 17 20:23:29.237796 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.237797 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" containerName="manager"
Apr 17 20:23:29.237970 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.237894 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" containerName="manager"
Apr 17 20:23:29.241337 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.241315 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z"]
Apr 17 20:23:29.241606 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.241575 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb"
Apr 17 20:23:29.242474 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.241808 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" containerName="manager" containerID="cri-o://ba2c61f8eb3ee51e77ea6c49380d958b3806fe082b39c25058ac25c085ea6767" gracePeriod=2
Apr 17 20:23:29.243563 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.243543 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z"
Apr 17 20:23:29.243952 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.243928 2579 status_manager.go:895] "Failed to get status for pod" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object"
Apr 17 20:23:29.244614 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.244590 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z"]
Apr 17 20:23:29.245640 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.245607 2579 status_manager.go:895] "Failed to get status for pod" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" err="pods \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object"
Apr 17 20:23:29.247531 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.247502 2579 status_manager.go:895] "Failed to get status for pod" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object"
Apr 17 20:23:29.257402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.257358 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb"]
Apr 17 20:23:29.264735 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.264707 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj"]
Apr 17 20:23:29.265312 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.265291 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" containerName="manager"
Apr 17 20:23:29.265374 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.265314 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" containerName="manager"
Apr 17 20:23:29.265420 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.265381 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" containerName="manager"
Apr 17 20:23:29.268496 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.268476 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj"
Apr 17 20:23:29.274989 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.274787 2579 status_manager.go:895] "Failed to get status for pod" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" err="pods \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object"
Apr 17 20:23:29.277721 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.277692 2579 status_manager.go:895] "Failed to get status for pod" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object"
Apr 17 20:23:29.279528 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.279487 2579 status_manager.go:895] "Failed to get status for pod" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" err="pods \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object"
Apr 17 20:23:29.280723 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.280699 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj"]
Apr 17
20:23:29.380527 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.380495 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrkq2\" (UniqueName: \"kubernetes.io/projected/daee2bbe-8d33-4910-ad59-5f4acb58a896-kube-api-access-jrkq2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gspkb\" (UID: \"daee2bbe-8d33-4910-ad59-5f4acb58a896\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" Apr 17 20:23:29.380703 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.380547 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6sx8\" (UniqueName: \"kubernetes.io/projected/7ebba3e7-ce6a-4679-a085-b93859f475f8-kube-api-access-s6sx8\") pod \"limitador-operator-controller-manager-85c4996f8c-2nqtj\" (UID: \"7ebba3e7-ce6a-4679-a085-b93859f475f8\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj" Apr 17 20:23:29.380703 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.380580 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/daee2bbe-8d33-4910-ad59-5f4acb58a896-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gspkb\" (UID: \"daee2bbe-8d33-4910-ad59-5f4acb58a896\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" Apr 17 20:23:29.481356 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.481328 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrkq2\" (UniqueName: \"kubernetes.io/projected/daee2bbe-8d33-4910-ad59-5f4acb58a896-kube-api-access-jrkq2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gspkb\" (UID: \"daee2bbe-8d33-4910-ad59-5f4acb58a896\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" Apr 17 20:23:29.481484 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:23:29.481386 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6sx8\" (UniqueName: \"kubernetes.io/projected/7ebba3e7-ce6a-4679-a085-b93859f475f8-kube-api-access-s6sx8\") pod \"limitador-operator-controller-manager-85c4996f8c-2nqtj\" (UID: \"7ebba3e7-ce6a-4679-a085-b93859f475f8\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj" Apr 17 20:23:29.481484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.481442 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/daee2bbe-8d33-4910-ad59-5f4acb58a896-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gspkb\" (UID: \"daee2bbe-8d33-4910-ad59-5f4acb58a896\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" Apr 17 20:23:29.481862 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.481837 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/daee2bbe-8d33-4910-ad59-5f4acb58a896-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gspkb\" (UID: \"daee2bbe-8d33-4910-ad59-5f4acb58a896\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" Apr 17 20:23:29.485071 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.485051 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" Apr 17 20:23:29.486920 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.486895 2579 status_manager.go:895] "Failed to get status for pod" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:29.488486 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.488453 2579 status_manager.go:895] "Failed to get status for pod" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" err="pods \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:29.488684 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.488665 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" Apr 17 20:23:29.489900 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.489874 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6sx8\" (UniqueName: \"kubernetes.io/projected/7ebba3e7-ce6a-4679-a085-b93859f475f8-kube-api-access-s6sx8\") pod \"limitador-operator-controller-manager-85c4996f8c-2nqtj\" (UID: \"7ebba3e7-ce6a-4679-a085-b93859f475f8\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj" Apr 17 20:23:29.489991 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.489915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrkq2\" (UniqueName: \"kubernetes.io/projected/daee2bbe-8d33-4910-ad59-5f4acb58a896-kube-api-access-jrkq2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gspkb\" (UID: \"daee2bbe-8d33-4910-ad59-5f4acb58a896\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" Apr 17 20:23:29.490481 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.490459 2579 status_manager.go:895] "Failed to get status for pod" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:29.492074 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.492054 2579 status_manager.go:895] "Failed to get status for pod" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" err="pods \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot 
get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:29.582398 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.582357 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8ea89f11-81bd-4658-9761-8284622dc37f-extensions-socket-volume\") pod \"8ea89f11-81bd-4658-9761-8284622dc37f\" (UID: \"8ea89f11-81bd-4658-9761-8284622dc37f\") " Apr 17 20:23:29.582579 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.582427 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgstw\" (UniqueName: \"kubernetes.io/projected/4cc0057e-8422-40fc-8e04-6d92533a76af-kube-api-access-bgstw\") pod \"4cc0057e-8422-40fc-8e04-6d92533a76af\" (UID: \"4cc0057e-8422-40fc-8e04-6d92533a76af\") " Apr 17 20:23:29.582579 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.582523 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9bxv\" (UniqueName: \"kubernetes.io/projected/8ea89f11-81bd-4658-9761-8284622dc37f-kube-api-access-t9bxv\") pod \"8ea89f11-81bd-4658-9761-8284622dc37f\" (UID: \"8ea89f11-81bd-4658-9761-8284622dc37f\") " Apr 17 20:23:29.582986 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.582953 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ea89f11-81bd-4658-9761-8284622dc37f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "8ea89f11-81bd-4658-9761-8284622dc37f" (UID: "8ea89f11-81bd-4658-9761-8284622dc37f"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:23:29.584760 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.584719 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc0057e-8422-40fc-8e04-6d92533a76af-kube-api-access-bgstw" (OuterVolumeSpecName: "kube-api-access-bgstw") pod "4cc0057e-8422-40fc-8e04-6d92533a76af" (UID: "4cc0057e-8422-40fc-8e04-6d92533a76af"). InnerVolumeSpecName "kube-api-access-bgstw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:23:29.584760 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.584736 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ea89f11-81bd-4658-9761-8284622dc37f-kube-api-access-t9bxv" (OuterVolumeSpecName: "kube-api-access-t9bxv") pod "8ea89f11-81bd-4658-9761-8284622dc37f" (UID: "8ea89f11-81bd-4658-9761-8284622dc37f"). InnerVolumeSpecName "kube-api-access-t9bxv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:23:29.657846 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.657810 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" Apr 17 20:23:29.665103 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.665076 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj" Apr 17 20:23:29.683316 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.683284 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t9bxv\" (UniqueName: \"kubernetes.io/projected/8ea89f11-81bd-4658-9761-8284622dc37f-kube-api-access-t9bxv\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:23:29.683316 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.683315 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8ea89f11-81bd-4658-9761-8284622dc37f-extensions-socket-volume\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:23:29.683512 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.683327 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bgstw\" (UniqueName: \"kubernetes.io/projected/4cc0057e-8422-40fc-8e04-6d92533a76af-kube-api-access-bgstw\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:23:29.703604 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.703570 2579 generic.go:358] "Generic (PLEG): container finished" podID="8ea89f11-81bd-4658-9761-8284622dc37f" containerID="cc6c4af4e8b4e57352288b2bda09359ea7675b029e2bc35b1e3cf10c7146c950" exitCode=0 Apr 17 20:23:29.703807 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.703660 2579 scope.go:117] "RemoveContainer" containerID="cc6c4af4e8b4e57352288b2bda09359ea7675b029e2bc35b1e3cf10c7146c950" Apr 17 20:23:29.703807 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.703671 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" Apr 17 20:23:29.705387 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.705314 2579 generic.go:358] "Generic (PLEG): container finished" podID="4cc0057e-8422-40fc-8e04-6d92533a76af" containerID="ba2c61f8eb3ee51e77ea6c49380d958b3806fe082b39c25058ac25c085ea6767" exitCode=0 Apr 17 20:23:29.705812 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.705721 2579 status_manager.go:895] "Failed to get status for pod" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:29.706292 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.706268 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" Apr 17 20:23:29.707450 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.707419 2579 status_manager.go:895] "Failed to get status for pod" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" err="pods \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:29.713732 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.713540 2579 status_manager.go:895] "Failed to get status for pod" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" err="pods \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:29.716105 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.716075 2579 status_manager.go:895] "Failed to get status for pod" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:29.719914 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.719195 2579 scope.go:117] "RemoveContainer" containerID="cc6c4af4e8b4e57352288b2bda09359ea7675b029e2bc35b1e3cf10c7146c950" Apr 17 20:23:29.721177 
ip-10-0-139-2 kubenswrapper[2579]: E0417 20:23:29.720479 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc6c4af4e8b4e57352288b2bda09359ea7675b029e2bc35b1e3cf10c7146c950\": container with ID starting with cc6c4af4e8b4e57352288b2bda09359ea7675b029e2bc35b1e3cf10c7146c950 not found: ID does not exist" containerID="cc6c4af4e8b4e57352288b2bda09359ea7675b029e2bc35b1e3cf10c7146c950" Apr 17 20:23:29.721177 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.720517 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc6c4af4e8b4e57352288b2bda09359ea7675b029e2bc35b1e3cf10c7146c950"} err="failed to get container status \"cc6c4af4e8b4e57352288b2bda09359ea7675b029e2bc35b1e3cf10c7146c950\": rpc error: code = NotFound desc = could not find container \"cc6c4af4e8b4e57352288b2bda09359ea7675b029e2bc35b1e3cf10c7146c950\": container with ID starting with cc6c4af4e8b4e57352288b2bda09359ea7675b029e2bc35b1e3cf10c7146c950 not found: ID does not exist" Apr 17 20:23:29.721177 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.720542 2579 scope.go:117] "RemoveContainer" containerID="ba2c61f8eb3ee51e77ea6c49380d958b3806fe082b39c25058ac25c085ea6767" Apr 17 20:23:29.727570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.726824 2579 status_manager.go:895] "Failed to get status for pod" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:29.728737 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.728518 2579 status_manager.go:895] "Failed to get status for pod" 
podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" err="pods \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:29.730569 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.730280 2579 status_manager.go:895] "Failed to get status for pod" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" err="pods \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:29.731962 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.731917 2579 status_manager.go:895] "Failed to get status for pod" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:29.740492 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.740469 2579 scope.go:117] "RemoveContainer" containerID="ba2c61f8eb3ee51e77ea6c49380d958b3806fe082b39c25058ac25c085ea6767" Apr 17 20:23:29.740934 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:23:29.740911 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ba2c61f8eb3ee51e77ea6c49380d958b3806fe082b39c25058ac25c085ea6767\": container with ID starting with ba2c61f8eb3ee51e77ea6c49380d958b3806fe082b39c25058ac25c085ea6767 not found: ID does not exist" containerID="ba2c61f8eb3ee51e77ea6c49380d958b3806fe082b39c25058ac25c085ea6767" Apr 17 20:23:29.741035 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.740940 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba2c61f8eb3ee51e77ea6c49380d958b3806fe082b39c25058ac25c085ea6767"} err="failed to get container status \"ba2c61f8eb3ee51e77ea6c49380d958b3806fe082b39c25058ac25c085ea6767\": rpc error: code = NotFound desc = could not find container \"ba2c61f8eb3ee51e77ea6c49380d958b3806fe082b39c25058ac25c085ea6767\": container with ID starting with ba2c61f8eb3ee51e77ea6c49380d958b3806fe082b39c25058ac25c085ea6767 not found: ID does not exist" Apr 17 20:23:29.802265 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.802233 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb"] Apr 17 20:23:29.805368 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:23:29.805329 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaee2bbe_8d33_4910_ad59_5f4acb58a896.slice/crio-12b3747958ff11587414339872559769db7703ec964f351b591f3b9e58ec9954 WatchSource:0}: Error finding container 12b3747958ff11587414339872559769db7703ec964f351b591f3b9e58ec9954: Status 404 returned error can't find the container with id 12b3747958ff11587414339872559769db7703ec964f351b591f3b9e58ec9954 Apr 17 20:23:29.834135 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:29.834114 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj"] Apr 17 20:23:29.836673 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:23:29.836644 2579 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ebba3e7_ce6a_4679_a085_b93859f475f8.slice/crio-4a334dd3cbef46395c71d49a56976a637629a04ffd72dd81fca9baa7711b55db WatchSource:0}: Error finding container 4a334dd3cbef46395c71d49a56976a637629a04ffd72dd81fca9baa7711b55db: Status 404 returned error can't find the container with id 4a334dd3cbef46395c71d49a56976a637629a04ffd72dd81fca9baa7711b55db Apr 17 20:23:30.714392 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.714357 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj" event={"ID":"7ebba3e7-ce6a-4679-a085-b93859f475f8","Type":"ContainerStarted","Data":"35506cd5d21f0a2d296c2ecbe601bc492fde47a616502a46691ec99f2fe29dad"} Apr 17 20:23:30.714831 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.714399 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj" event={"ID":"7ebba3e7-ce6a-4679-a085-b93859f475f8","Type":"ContainerStarted","Data":"4a334dd3cbef46395c71d49a56976a637629a04ffd72dd81fca9baa7711b55db"} Apr 17 20:23:30.714831 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.714492 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj" Apr 17 20:23:30.715865 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.715840 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" event={"ID":"daee2bbe-8d33-4910-ad59-5f4acb58a896","Type":"ContainerStarted","Data":"ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029"} Apr 17 20:23:30.715978 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.715871 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" 
event={"ID":"daee2bbe-8d33-4910-ad59-5f4acb58a896","Type":"ContainerStarted","Data":"12b3747958ff11587414339872559769db7703ec964f351b591f3b9e58ec9954"} Apr 17 20:23:30.715978 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.715902 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" Apr 17 20:23:30.716831 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.716810 2579 status_manager.go:895] "Failed to get status for pod" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:30.719458 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.719436 2579 status_manager.go:895] "Failed to get status for pod" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" err="pods \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:30.737212 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.737171 2579 status_manager.go:895] "Failed to get status for pod" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found 
between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:30.737943 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.737905 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj" podStartSLOduration=1.7378926670000001 podStartE2EDuration="1.737892667s" podCreationTimestamp="2026-04-17 20:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:23:30.735409614 +0000 UTC m=+480.400335315" watchObservedRunningTime="2026-04-17 20:23:30.737892667 +0000 UTC m=+480.402818357" Apr 17 20:23:30.754534 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.754491 2579 status_manager.go:895] "Failed to get status for pod" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" err="pods \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:30.755161 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.755116 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" podStartSLOduration=1.755099067 podStartE2EDuration="1.755099067s" podCreationTimestamp="2026-04-17 20:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:23:30.752775305 +0000 UTC m=+480.417700994" watchObservedRunningTime="2026-04-17 20:23:30.755099067 +0000 UTC m=+480.420024761" Apr 17 20:23:30.878065 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.878027 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" path="/var/lib/kubelet/pods/4cc0057e-8422-40fc-8e04-6d92533a76af/volumes" Apr 17 20:23:30.878214 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.878095 2579 status_manager.go:895] "Failed to get status for pod" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-c6dg6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-c6dg6\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:30.878476 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.878461 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ea89f11-81bd-4658-9761-8284622dc37f" path="/var/lib/kubelet/pods/8ea89f11-81bd-4658-9761-8284622dc37f/volumes" Apr 17 20:23:30.879967 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:30.879947 2579 status_manager.go:895] "Failed to get status for pod" podUID="4cc0057e-8422-40fc-8e04-6d92533a76af" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xsh9z" err="pods \"limitador-operator-controller-manager-85c4996f8c-xsh9z\" is forbidden: User \"system:node:ip-10-0-139-2.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-2.ec2.internal' and this object" Apr 17 20:23:39.719349 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.719291 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-bcb7f564c-6w4fw" podUID="09875294-d8e4-4034-ae11-1838d7158d64" containerName="console" containerID="cri-o://81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f" gracePeriod=15 Apr 17 20:23:39.967602 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.967580 2579 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-console_console-bcb7f564c-6w4fw_09875294-d8e4-4034-ae11-1838d7158d64/console/0.log" Apr 17 20:23:39.967775 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.967688 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:23:39.971016 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.970959 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-oauth-serving-cert\") pod \"09875294-d8e4-4034-ae11-1838d7158d64\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " Apr 17 20:23:39.971016 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.971000 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw4wp\" (UniqueName: \"kubernetes.io/projected/09875294-d8e4-4034-ae11-1838d7158d64-kube-api-access-cw4wp\") pod \"09875294-d8e4-4034-ae11-1838d7158d64\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " Apr 17 20:23:39.971177 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.971018 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-trusted-ca-bundle\") pod \"09875294-d8e4-4034-ae11-1838d7158d64\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " Apr 17 20:23:39.971177 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.971042 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09875294-d8e4-4034-ae11-1838d7158d64-console-serving-cert\") pod \"09875294-d8e4-4034-ae11-1838d7158d64\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " Apr 17 20:23:39.971177 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.971069 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-console-config\") pod \"09875294-d8e4-4034-ae11-1838d7158d64\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " Apr 17 20:23:39.971600 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.971360 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "09875294-d8e4-4034-ae11-1838d7158d64" (UID: "09875294-d8e4-4034-ae11-1838d7158d64"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:23:39.971600 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.971506 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09875294-d8e4-4034-ae11-1838d7158d64" (UID: "09875294-d8e4-4034-ae11-1838d7158d64"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:23:39.971600 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.971517 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-console-config" (OuterVolumeSpecName: "console-config") pod "09875294-d8e4-4034-ae11-1838d7158d64" (UID: "09875294-d8e4-4034-ae11-1838d7158d64"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:23:39.973355 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.973325 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09875294-d8e4-4034-ae11-1838d7158d64-kube-api-access-cw4wp" (OuterVolumeSpecName: "kube-api-access-cw4wp") pod "09875294-d8e4-4034-ae11-1838d7158d64" (UID: "09875294-d8e4-4034-ae11-1838d7158d64"). InnerVolumeSpecName "kube-api-access-cw4wp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:23:39.973462 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:39.973400 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09875294-d8e4-4034-ae11-1838d7158d64-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "09875294-d8e4-4034-ae11-1838d7158d64" (UID: "09875294-d8e4-4034-ae11-1838d7158d64"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:23:40.072246 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.072206 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09875294-d8e4-4034-ae11-1838d7158d64-console-oauth-config\") pod \"09875294-d8e4-4034-ae11-1838d7158d64\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " Apr 17 20:23:40.072429 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.072255 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-service-ca\") pod \"09875294-d8e4-4034-ae11-1838d7158d64\" (UID: \"09875294-d8e4-4034-ae11-1838d7158d64\") " Apr 17 20:23:40.072484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.072472 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-oauth-serving-cert\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:23:40.072522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.072487 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cw4wp\" (UniqueName: \"kubernetes.io/projected/09875294-d8e4-4034-ae11-1838d7158d64-kube-api-access-cw4wp\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:23:40.072522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.072497 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-trusted-ca-bundle\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:23:40.072522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.072507 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09875294-d8e4-4034-ae11-1838d7158d64-console-serving-cert\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:23:40.072522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.072515 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-console-config\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:23:40.072642 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.072604 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-service-ca" (OuterVolumeSpecName: "service-ca") pod "09875294-d8e4-4034-ae11-1838d7158d64" (UID: "09875294-d8e4-4034-ae11-1838d7158d64"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:23:40.074436 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.074413 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09875294-d8e4-4034-ae11-1838d7158d64-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "09875294-d8e4-4034-ae11-1838d7158d64" (UID: "09875294-d8e4-4034-ae11-1838d7158d64"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:23:40.173375 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.173341 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09875294-d8e4-4034-ae11-1838d7158d64-console-oauth-config\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:23:40.173375 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.173368 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09875294-d8e4-4034-ae11-1838d7158d64-service-ca\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:23:40.757893 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.757861 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bcb7f564c-6w4fw_09875294-d8e4-4034-ae11-1838d7158d64/console/0.log" Apr 17 20:23:40.758337 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.757906 2579 generic.go:358] "Generic (PLEG): container finished" podID="09875294-d8e4-4034-ae11-1838d7158d64" containerID="81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f" exitCode=2 Apr 17 20:23:40.758337 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.757968 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bcb7f564c-6w4fw" Apr 17 20:23:40.758337 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.757968 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bcb7f564c-6w4fw" event={"ID":"09875294-d8e4-4034-ae11-1838d7158d64","Type":"ContainerDied","Data":"81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f"} Apr 17 20:23:40.758337 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.758081 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bcb7f564c-6w4fw" event={"ID":"09875294-d8e4-4034-ae11-1838d7158d64","Type":"ContainerDied","Data":"6a5ac9b3655b738bed106a5fbf3eef6dbdba632ea2d658f4f51ace36d878f790"} Apr 17 20:23:40.758337 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.758102 2579 scope.go:117] "RemoveContainer" containerID="81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f" Apr 17 20:23:40.767550 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.767532 2579 scope.go:117] "RemoveContainer" containerID="81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f" Apr 17 20:23:40.767861 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:23:40.767836 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f\": container with ID starting with 81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f not found: ID does not exist" containerID="81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f" Apr 17 20:23:40.767935 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.767870 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f"} err="failed to get container status \"81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f\": rpc error: code = NotFound desc = 
could not find container \"81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f\": container with ID starting with 81b2345e92810950ace88bd6a3e8e5484103e0760cab7ebca06736129106bc4f not found: ID does not exist" Apr 17 20:23:40.779255 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.779227 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bcb7f564c-6w4fw"] Apr 17 20:23:40.782661 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.782636 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-bcb7f564c-6w4fw"] Apr 17 20:23:40.878406 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:40.878369 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09875294-d8e4-4034-ae11-1838d7158d64" path="/var/lib/kubelet/pods/09875294-d8e4-4034-ae11-1838d7158d64/volumes" Apr 17 20:23:41.722296 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:41.722267 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2nqtj" Apr 17 20:23:41.722478 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:41.722321 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" Apr 17 20:23:45.918519 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:45.918475 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb"] Apr 17 20:23:45.919023 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:45.918801 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" podUID="daee2bbe-8d33-4910-ad59-5f4acb58a896" containerName="manager" containerID="cri-o://ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029" gracePeriod=10 Apr 17 20:23:46.166152 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:23:46.166125 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" Apr 17 20:23:46.227987 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.227909 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/daee2bbe-8d33-4910-ad59-5f4acb58a896-extensions-socket-volume\") pod \"daee2bbe-8d33-4910-ad59-5f4acb58a896\" (UID: \"daee2bbe-8d33-4910-ad59-5f4acb58a896\") " Apr 17 20:23:46.227987 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.227970 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrkq2\" (UniqueName: \"kubernetes.io/projected/daee2bbe-8d33-4910-ad59-5f4acb58a896-kube-api-access-jrkq2\") pod \"daee2bbe-8d33-4910-ad59-5f4acb58a896\" (UID: \"daee2bbe-8d33-4910-ad59-5f4acb58a896\") " Apr 17 20:23:46.228360 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.228336 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daee2bbe-8d33-4910-ad59-5f4acb58a896-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "daee2bbe-8d33-4910-ad59-5f4acb58a896" (UID: "daee2bbe-8d33-4910-ad59-5f4acb58a896"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:23:46.230194 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.230169 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daee2bbe-8d33-4910-ad59-5f4acb58a896-kube-api-access-jrkq2" (OuterVolumeSpecName: "kube-api-access-jrkq2") pod "daee2bbe-8d33-4910-ad59-5f4acb58a896" (UID: "daee2bbe-8d33-4910-ad59-5f4acb58a896"). InnerVolumeSpecName "kube-api-access-jrkq2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:23:46.329129 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.329092 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/daee2bbe-8d33-4910-ad59-5f4acb58a896-extensions-socket-volume\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:23:46.329129 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.329122 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jrkq2\" (UniqueName: \"kubernetes.io/projected/daee2bbe-8d33-4910-ad59-5f4acb58a896-kube-api-access-jrkq2\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:23:46.784959 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.784918 2579 generic.go:358] "Generic (PLEG): container finished" podID="daee2bbe-8d33-4910-ad59-5f4acb58a896" containerID="ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029" exitCode=0 Apr 17 20:23:46.785132 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.784989 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" Apr 17 20:23:46.785132 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.785001 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" event={"ID":"daee2bbe-8d33-4910-ad59-5f4acb58a896","Type":"ContainerDied","Data":"ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029"} Apr 17 20:23:46.785132 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.785042 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb" event={"ID":"daee2bbe-8d33-4910-ad59-5f4acb58a896","Type":"ContainerDied","Data":"12b3747958ff11587414339872559769db7703ec964f351b591f3b9e58ec9954"} Apr 17 20:23:46.785132 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.785058 2579 scope.go:117] "RemoveContainer" containerID="ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029" Apr 17 20:23:46.794253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.794234 2579 scope.go:117] "RemoveContainer" containerID="ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029" Apr 17 20:23:46.794520 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:23:46.794501 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029\": container with ID starting with ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029 not found: ID does not exist" containerID="ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029" Apr 17 20:23:46.794560 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.794530 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029"} err="failed to get container status 
\"ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029\": rpc error: code = NotFound desc = could not find container \"ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029\": container with ID starting with ae1ee0a3fa7d5acbd5c0e15d11a1459d01313f66184c668fddeef26f1d5aa029 not found: ID does not exist" Apr 17 20:23:46.809795 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.809765 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb"] Apr 17 20:23:46.815388 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.815361 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gspkb"] Apr 17 20:23:46.878830 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:23:46.878797 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daee2bbe-8d33-4910-ad59-5f4acb58a896" path="/var/lib/kubelet/pods/daee2bbe-8d33-4910-ad59-5f4acb58a896/volumes" Apr 17 20:24:02.069184 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.069139 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"] Apr 17 20:24:02.069895 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.069547 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09875294-d8e4-4034-ae11-1838d7158d64" containerName="console" Apr 17 20:24:02.069895 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.069562 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="09875294-d8e4-4034-ae11-1838d7158d64" containerName="console" Apr 17 20:24:02.069895 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.069585 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="daee2bbe-8d33-4910-ad59-5f4acb58a896" containerName="manager" Apr 17 20:24:02.069895 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.069590 2579 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="daee2bbe-8d33-4910-ad59-5f4acb58a896" containerName="manager" Apr 17 20:24:02.069895 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.069652 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="09875294-d8e4-4034-ae11-1838d7158d64" containerName="console" Apr 17 20:24:02.069895 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.069665 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="daee2bbe-8d33-4910-ad59-5f4acb58a896" containerName="manager" Apr 17 20:24:02.075006 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.074981 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.077209 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.077182 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-fv9nz\"" Apr 17 20:24:02.083433 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.083403 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"] Apr 17 20:24:02.163151 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.163106 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e146b460-6bc7-439b-b7f1-513b847a73d3-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.163338 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.163165 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-istio-data\") pod 
\"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.163338 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.163229 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gfj7\" (UniqueName: \"kubernetes.io/projected/e146b460-6bc7-439b-b7f1-513b847a73d3-kube-api-access-5gfj7\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.163338 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.163312 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.163460 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.163344 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.163460 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.163379 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-credential-socket\") pod 
\"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.163460 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.163397 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e146b460-6bc7-439b-b7f1-513b847a73d3-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.163460 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.163422 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e146b460-6bc7-439b-b7f1-513b847a73d3-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.163460 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.163449 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.264829 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.264796 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e146b460-6bc7-439b-b7f1-513b847a73d3-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: 
\"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.264829 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.264840 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.265112 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.264857 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gfj7\" (UniqueName: \"kubernetes.io/projected/e146b460-6bc7-439b-b7f1-513b847a73d3-kube-api-access-5gfj7\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.265112 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.264878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.265112 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.264911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 
20:24:02.265112 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.264932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.265112 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.264950 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e146b460-6bc7-439b-b7f1-513b847a73d3-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.265112 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.264979 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e146b460-6bc7-439b-b7f1-513b847a73d3-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.265112 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.265018 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" Apr 17 20:24:02.265574 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.265287 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:02.265574 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.265377 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:02.265574 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.265395 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:02.265574 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.265451 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:02.265710 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.265575 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e146b460-6bc7-439b-b7f1-513b847a73d3-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:02.267397 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.267378 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e146b460-6bc7-439b-b7f1-513b847a73d3-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:02.267735 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.267713 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e146b460-6bc7-439b-b7f1-513b847a73d3-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:02.273522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.273499 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e146b460-6bc7-439b-b7f1-513b847a73d3-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:02.273839 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.273818 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gfj7\" (UniqueName: \"kubernetes.io/projected/e146b460-6bc7-439b-b7f1-513b847a73d3-kube-api-access-5gfj7\") pod \"maas-default-gateway-openshift-default-58b6f876-66qbf\" (UID: \"e146b460-6bc7-439b-b7f1-513b847a73d3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:02.388465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.388434 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:02.522206 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.522173 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"]
Apr 17 20:24:02.523100 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:24:02.523073 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode146b460_6bc7_439b_b7f1_513b847a73d3.slice/crio-d51e44aa425340a6a95aee12568856418c55ae01578ac48fa8af254346afcacf WatchSource:0}: Error finding container d51e44aa425340a6a95aee12568856418c55ae01578ac48fa8af254346afcacf: Status 404 returned error can't find the container with id d51e44aa425340a6a95aee12568856418c55ae01578ac48fa8af254346afcacf
Apr 17 20:24:02.525136 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.525105 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 17 20:24:02.525205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.525165 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 17 20:24:02.525205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.525192 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 17 20:24:02.850136 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.850040 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" event={"ID":"e146b460-6bc7-439b-b7f1-513b847a73d3","Type":"ContainerStarted","Data":"f46ba9b1bdfe08e48a8dd8b12d44e90a0be8528d72f328be9c91f467135b10ff"}
Apr 17 20:24:02.850136 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.850083 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" event={"ID":"e146b460-6bc7-439b-b7f1-513b847a73d3","Type":"ContainerStarted","Data":"d51e44aa425340a6a95aee12568856418c55ae01578ac48fa8af254346afcacf"}
Apr 17 20:24:02.869734 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:02.869685 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf" podStartSLOduration=0.869668544 podStartE2EDuration="869.668544ms" podCreationTimestamp="2026-04-17 20:24:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:24:02.868720325 +0000 UTC m=+512.533646023" watchObservedRunningTime="2026-04-17 20:24:02.869668544 +0000 UTC m=+512.534594235"
Apr 17 20:24:03.389658 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:03.389618 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:03.394580 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:03.394553 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:03.854428 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:03.854345 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:03.855392 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:03.855374 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-66qbf"
Apr 17 20:24:06.069702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.069618 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6qrqs"]
Apr 17 20:24:06.073522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.073504 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs"
Apr 17 20:24:06.075761 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.075725 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-dhzz5\""
Apr 17 20:24:06.075975 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.075810 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 20:24:06.080434 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.080409 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6qrqs"]
Apr 17 20:24:06.170289 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.170254 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6qrqs"]
Apr 17 20:24:06.201156 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.201124 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/24995d8b-44df-4f70-8094-382614d5f1a6-config-file\") pod \"limitador-limitador-7d549b5b-6qrqs\" (UID: \"24995d8b-44df-4f70-8094-382614d5f1a6\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs"
Apr 17 20:24:06.201347 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.201235 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c5bn\" (UniqueName: \"kubernetes.io/projected/24995d8b-44df-4f70-8094-382614d5f1a6-kube-api-access-5c5bn\") pod \"limitador-limitador-7d549b5b-6qrqs\" (UID: \"24995d8b-44df-4f70-8094-382614d5f1a6\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs"
Apr 17 20:24:06.302119 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.302079 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5c5bn\" (UniqueName: \"kubernetes.io/projected/24995d8b-44df-4f70-8094-382614d5f1a6-kube-api-access-5c5bn\") pod \"limitador-limitador-7d549b5b-6qrqs\" (UID: \"24995d8b-44df-4f70-8094-382614d5f1a6\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs"
Apr 17 20:24:06.302334 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.302163 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/24995d8b-44df-4f70-8094-382614d5f1a6-config-file\") pod \"limitador-limitador-7d549b5b-6qrqs\" (UID: \"24995d8b-44df-4f70-8094-382614d5f1a6\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs"
Apr 17 20:24:06.302811 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.302788 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/24995d8b-44df-4f70-8094-382614d5f1a6-config-file\") pod \"limitador-limitador-7d549b5b-6qrqs\" (UID: \"24995d8b-44df-4f70-8094-382614d5f1a6\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs"
Apr 17 20:24:06.312858 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.312828 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c5bn\" (UniqueName: \"kubernetes.io/projected/24995d8b-44df-4f70-8094-382614d5f1a6-kube-api-access-5c5bn\") pod \"limitador-limitador-7d549b5b-6qrqs\" (UID: \"24995d8b-44df-4f70-8094-382614d5f1a6\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs"
Apr 17 20:24:06.385413 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.385366 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs"
Apr 17 20:24:06.519295 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.519269 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6qrqs"]
Apr 17 20:24:06.521489 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:24:06.521454 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24995d8b_44df_4f70_8094_382614d5f1a6.slice/crio-808293acd9d7b765044f00ed788b47f13485543dbe11848adc4e58db7a29519e WatchSource:0}: Error finding container 808293acd9d7b765044f00ed788b47f13485543dbe11848adc4e58db7a29519e: Status 404 returned error can't find the container with id 808293acd9d7b765044f00ed788b47f13485543dbe11848adc4e58db7a29519e
Apr 17 20:24:06.867253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.867161 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs" event={"ID":"24995d8b-44df-4f70-8094-382614d5f1a6","Type":"ContainerStarted","Data":"808293acd9d7b765044f00ed788b47f13485543dbe11848adc4e58db7a29519e"}
Apr 17 20:24:06.889985 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.889951 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-96676"]
Apr 17 20:24:06.895796 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.895691 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-96676"
Apr 17 20:24:06.898425 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.898400 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-5m5m7\""
Apr 17 20:24:06.899656 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:06.899581 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-96676"]
Apr 17 20:24:07.009045 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.009004 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnljd\" (UniqueName: \"kubernetes.io/projected/20d3e749-ffa2-4a00-8ad6-060ad64664aa-kube-api-access-rnljd\") pod \"authorino-f99f4b5cd-96676\" (UID: \"20d3e749-ffa2-4a00-8ad6-060ad64664aa\") " pod="kuadrant-system/authorino-f99f4b5cd-96676"
Apr 17 20:24:07.075050 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.075002 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-h4t68"]
Apr 17 20:24:07.078807 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.078780 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-h4t68"
Apr 17 20:24:07.080817 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.080791 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-h4t68"]
Apr 17 20:24:07.110208 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.110177 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnljd\" (UniqueName: \"kubernetes.io/projected/20d3e749-ffa2-4a00-8ad6-060ad64664aa-kube-api-access-rnljd\") pod \"authorino-f99f4b5cd-96676\" (UID: \"20d3e749-ffa2-4a00-8ad6-060ad64664aa\") " pod="kuadrant-system/authorino-f99f4b5cd-96676"
Apr 17 20:24:07.117987 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.117920 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnljd\" (UniqueName: \"kubernetes.io/projected/20d3e749-ffa2-4a00-8ad6-060ad64664aa-kube-api-access-rnljd\") pod \"authorino-f99f4b5cd-96676\" (UID: \"20d3e749-ffa2-4a00-8ad6-060ad64664aa\") " pod="kuadrant-system/authorino-f99f4b5cd-96676"
Apr 17 20:24:07.206998 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.206952 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-96676"
Apr 17 20:24:07.211215 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.211179 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6bxk\" (UniqueName: \"kubernetes.io/projected/800f372e-8a0c-44a6-82ce-ee20094d9273-kube-api-access-p6bxk\") pod \"authorino-7498df8756-h4t68\" (UID: \"800f372e-8a0c-44a6-82ce-ee20094d9273\") " pod="kuadrant-system/authorino-7498df8756-h4t68"
Apr 17 20:24:07.313103 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.312690 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6bxk\" (UniqueName: \"kubernetes.io/projected/800f372e-8a0c-44a6-82ce-ee20094d9273-kube-api-access-p6bxk\") pod \"authorino-7498df8756-h4t68\" (UID: \"800f372e-8a0c-44a6-82ce-ee20094d9273\") " pod="kuadrant-system/authorino-7498df8756-h4t68"
Apr 17 20:24:07.344530 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.344308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6bxk\" (UniqueName: \"kubernetes.io/projected/800f372e-8a0c-44a6-82ce-ee20094d9273-kube-api-access-p6bxk\") pod \"authorino-7498df8756-h4t68\" (UID: \"800f372e-8a0c-44a6-82ce-ee20094d9273\") " pod="kuadrant-system/authorino-7498df8756-h4t68"
Apr 17 20:24:07.390169 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.390141 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-96676"]
Apr 17 20:24:07.391320 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.391293 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-h4t68"
Apr 17 20:24:07.392767 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:24:07.392691 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20d3e749_ffa2_4a00_8ad6_060ad64664aa.slice/crio-e575c68927fef3a9d5b84017b61653599427a5999c6dbeb2c0ac8ff6427e5984 WatchSource:0}: Error finding container e575c68927fef3a9d5b84017b61653599427a5999c6dbeb2c0ac8ff6427e5984: Status 404 returned error can't find the container with id e575c68927fef3a9d5b84017b61653599427a5999c6dbeb2c0ac8ff6427e5984
Apr 17 20:24:07.578889 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.578828 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-h4t68"]
Apr 17 20:24:07.579997 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:24:07.579957 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod800f372e_8a0c_44a6_82ce_ee20094d9273.slice/crio-114ac02a9a3bf8e8a7d2f20cf0d4b52ac711338ca577fd52dc73be6b192e7e33 WatchSource:0}: Error finding container 114ac02a9a3bf8e8a7d2f20cf0d4b52ac711338ca577fd52dc73be6b192e7e33: Status 404 returned error can't find the container with id 114ac02a9a3bf8e8a7d2f20cf0d4b52ac711338ca577fd52dc73be6b192e7e33
Apr 17 20:24:07.873107 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.873052 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-h4t68" event={"ID":"800f372e-8a0c-44a6-82ce-ee20094d9273","Type":"ContainerStarted","Data":"114ac02a9a3bf8e8a7d2f20cf0d4b52ac711338ca577fd52dc73be6b192e7e33"}
Apr 17 20:24:07.874301 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:07.874274 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-96676" event={"ID":"20d3e749-ffa2-4a00-8ad6-060ad64664aa","Type":"ContainerStarted","Data":"e575c68927fef3a9d5b84017b61653599427a5999c6dbeb2c0ac8ff6427e5984"}
Apr 17 20:24:10.888005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:10.887967 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs" event={"ID":"24995d8b-44df-4f70-8094-382614d5f1a6","Type":"ContainerStarted","Data":"15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b"}
Apr 17 20:24:10.888460 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:10.888091 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs"
Apr 17 20:24:10.889497 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:10.889460 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-h4t68" event={"ID":"800f372e-8a0c-44a6-82ce-ee20094d9273","Type":"ContainerStarted","Data":"ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214"}
Apr 17 20:24:10.890881 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:10.890862 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-96676" event={"ID":"20d3e749-ffa2-4a00-8ad6-060ad64664aa","Type":"ContainerStarted","Data":"f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb"}
Apr 17 20:24:10.928400 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:10.928353 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-96676" podStartSLOduration=1.78790889 podStartE2EDuration="4.928338879s" podCreationTimestamp="2026-04-17 20:24:06 +0000 UTC" firstStartedPulling="2026-04-17 20:24:07.394435235 +0000 UTC m=+517.059360918" lastFinishedPulling="2026-04-17 20:24:10.534865227 +0000 UTC m=+520.199790907" observedRunningTime="2026-04-17 20:24:10.926115082 +0000 UTC m=+520.591040772" watchObservedRunningTime="2026-04-17 20:24:10.928338879 +0000 UTC m=+520.593264569"
Apr 17 20:24:10.945732 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:10.945663 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs" podStartSLOduration=0.987600803 podStartE2EDuration="4.945644448s" podCreationTimestamp="2026-04-17 20:24:06 +0000 UTC" firstStartedPulling="2026-04-17 20:24:06.52348043 +0000 UTC m=+516.188406100" lastFinishedPulling="2026-04-17 20:24:10.481524069 +0000 UTC m=+520.146449745" observedRunningTime="2026-04-17 20:24:10.94265028 +0000 UTC m=+520.607575970" watchObservedRunningTime="2026-04-17 20:24:10.945644448 +0000 UTC m=+520.610570140"
Apr 17 20:24:10.967585 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:10.967516 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-h4t68" podStartSLOduration=1.005552091 podStartE2EDuration="3.967497006s" podCreationTimestamp="2026-04-17 20:24:07 +0000 UTC" firstStartedPulling="2026-04-17 20:24:07.581494035 +0000 UTC m=+517.246419706" lastFinishedPulling="2026-04-17 20:24:10.543438935 +0000 UTC m=+520.208364621" observedRunningTime="2026-04-17 20:24:10.965427849 +0000 UTC m=+520.630353540" watchObservedRunningTime="2026-04-17 20:24:10.967497006 +0000 UTC m=+520.632422703"
Apr 17 20:24:10.989808 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:10.989680 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-96676"]
Apr 17 20:24:12.899253 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:12.899210 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-96676" podUID="20d3e749-ffa2-4a00-8ad6-060ad64664aa" containerName="authorino" containerID="cri-o://f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb" gracePeriod=30
Apr 17 20:24:13.148341 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.148315 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-96676"
Apr 17 20:24:13.277269 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.277188 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnljd\" (UniqueName: \"kubernetes.io/projected/20d3e749-ffa2-4a00-8ad6-060ad64664aa-kube-api-access-rnljd\") pod \"20d3e749-ffa2-4a00-8ad6-060ad64664aa\" (UID: \"20d3e749-ffa2-4a00-8ad6-060ad64664aa\") "
Apr 17 20:24:13.279405 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.279379 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20d3e749-ffa2-4a00-8ad6-060ad64664aa-kube-api-access-rnljd" (OuterVolumeSpecName: "kube-api-access-rnljd") pod "20d3e749-ffa2-4a00-8ad6-060ad64664aa" (UID: "20d3e749-ffa2-4a00-8ad6-060ad64664aa"). InnerVolumeSpecName "kube-api-access-rnljd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:24:13.378728 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.378680 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rnljd\" (UniqueName: \"kubernetes.io/projected/20d3e749-ffa2-4a00-8ad6-060ad64664aa-kube-api-access-rnljd\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\""
Apr 17 20:24:13.904687 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.904646 2579 generic.go:358] "Generic (PLEG): container finished" podID="20d3e749-ffa2-4a00-8ad6-060ad64664aa" containerID="f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb" exitCode=0
Apr 17 20:24:13.905160 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.904700 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-96676"
Apr 17 20:24:13.905160 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.904736 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-96676" event={"ID":"20d3e749-ffa2-4a00-8ad6-060ad64664aa","Type":"ContainerDied","Data":"f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb"}
Apr 17 20:24:13.905160 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.904803 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-96676" event={"ID":"20d3e749-ffa2-4a00-8ad6-060ad64664aa","Type":"ContainerDied","Data":"e575c68927fef3a9d5b84017b61653599427a5999c6dbeb2c0ac8ff6427e5984"}
Apr 17 20:24:13.905160 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.904823 2579 scope.go:117] "RemoveContainer" containerID="f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb"
Apr 17 20:24:13.914640 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.914617 2579 scope.go:117] "RemoveContainer" containerID="f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb"
Apr 17 20:24:13.914942 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:24:13.914923 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb\": container with ID starting with f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb not found: ID does not exist" containerID="f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb"
Apr 17 20:24:13.915019 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.914958 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb"} err="failed to get container status \"f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb\": rpc error: code = NotFound desc = could not find container \"f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb\": container with ID starting with f2733507d89f82581c32d0ade598eeddb59da71be825dde883e58a039b6108eb not found: ID does not exist"
Apr 17 20:24:13.925484 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.925450 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-96676"]
Apr 17 20:24:13.927811 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:13.927787 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-96676"]
Apr 17 20:24:14.878581 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:14.878548 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20d3e749-ffa2-4a00-8ad6-060ad64664aa" path="/var/lib/kubelet/pods/20d3e749-ffa2-4a00-8ad6-060ad64664aa/volumes"
Apr 17 20:24:21.283732 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.283639 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6qrqs"]
Apr 17 20:24:21.284186 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.283947 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs" podUID="24995d8b-44df-4f70-8094-382614d5f1a6" containerName="limitador" containerID="cri-o://15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b" gracePeriod=30
Apr 17 20:24:21.284664 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.284642 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs"
Apr 17 20:24:21.826853 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.826827 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs"
Apr 17 20:24:21.939351 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.939316 2579 generic.go:358] "Generic (PLEG): container finished" podID="24995d8b-44df-4f70-8094-382614d5f1a6" containerID="15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b" exitCode=0
Apr 17 20:24:21.939529 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.939383 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs"
Apr 17 20:24:21.939529 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.939406 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs" event={"ID":"24995d8b-44df-4f70-8094-382614d5f1a6","Type":"ContainerDied","Data":"15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b"}
Apr 17 20:24:21.939529 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.939445 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6qrqs" event={"ID":"24995d8b-44df-4f70-8094-382614d5f1a6","Type":"ContainerDied","Data":"808293acd9d7b765044f00ed788b47f13485543dbe11848adc4e58db7a29519e"}
Apr 17 20:24:21.939529 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.939462 2579 scope.go:117] "RemoveContainer" containerID="15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b"
Apr 17 20:24:21.948529 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.948509 2579 scope.go:117] "RemoveContainer" containerID="15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b"
Apr 17 20:24:21.949002 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:24:21.948799 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b\": container with ID starting with 15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b not found: ID does not exist" containerID="15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b"
Apr 17 20:24:21.949128 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.949017 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b"} err="failed to get container status \"15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b\": rpc error: code = NotFound desc = could not find container \"15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b\": container with ID starting with 15309bb83d50522bbf466e0ac428bae1107fa76f10186df607ca536e63ed9a9b not found: ID does not exist"
Apr 17 20:24:21.956802 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.956776 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/24995d8b-44df-4f70-8094-382614d5f1a6-config-file\") pod \"24995d8b-44df-4f70-8094-382614d5f1a6\" (UID: \"24995d8b-44df-4f70-8094-382614d5f1a6\") "
Apr 17 20:24:21.956951 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.956890 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c5bn\" (UniqueName: \"kubernetes.io/projected/24995d8b-44df-4f70-8094-382614d5f1a6-kube-api-access-5c5bn\") pod \"24995d8b-44df-4f70-8094-382614d5f1a6\" (UID: \"24995d8b-44df-4f70-8094-382614d5f1a6\") "
Apr 17 20:24:21.957118 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.957095 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24995d8b-44df-4f70-8094-382614d5f1a6-config-file" (OuterVolumeSpecName: "config-file") pod "24995d8b-44df-4f70-8094-382614d5f1a6" (UID: "24995d8b-44df-4f70-8094-382614d5f1a6"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:24:21.957219 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.957202 2579 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/24995d8b-44df-4f70-8094-382614d5f1a6-config-file\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\""
Apr 17 20:24:21.959048 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:21.959025 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24995d8b-44df-4f70-8094-382614d5f1a6-kube-api-access-5c5bn" (OuterVolumeSpecName: "kube-api-access-5c5bn") pod "24995d8b-44df-4f70-8094-382614d5f1a6" (UID: "24995d8b-44df-4f70-8094-382614d5f1a6"). InnerVolumeSpecName "kube-api-access-5c5bn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:24:22.058269 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:22.058233 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5c5bn\" (UniqueName: \"kubernetes.io/projected/24995d8b-44df-4f70-8094-382614d5f1a6-kube-api-access-5c5bn\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\""
Apr 17 20:24:22.261438 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:22.261398 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6qrqs"]
Apr 17 20:24:22.267732 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:22.267702 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6qrqs"]
Apr 17 20:24:22.881702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:22.881661 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24995d8b-44df-4f70-8094-382614d5f1a6" path="/var/lib/kubelet/pods/24995d8b-44df-4f70-8094-382614d5f1a6/volumes"
Apr 17 20:24:27.023403 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.023367 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-mtgz9"]
Apr 17 20:24:27.023799 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.023731 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24995d8b-44df-4f70-8094-382614d5f1a6" containerName="limitador"
Apr 17 20:24:27.023799 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.023757 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="24995d8b-44df-4f70-8094-382614d5f1a6" containerName="limitador"
Apr 17 20:24:27.023799 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.023766 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20d3e749-ffa2-4a00-8ad6-060ad64664aa" containerName="authorino"
Apr 17 20:24:27.023799 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.023771 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d3e749-ffa2-4a00-8ad6-060ad64664aa" containerName="authorino"
Apr 17 20:24:27.023942 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.023832 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="24995d8b-44df-4f70-8094-382614d5f1a6" containerName="limitador"
Apr 17 20:24:27.023942 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.023844 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="20d3e749-ffa2-4a00-8ad6-060ad64664aa" containerName="authorino"
Apr 17 20:24:27.028309 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.028286 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-mtgz9"
Apr 17 20:24:27.031666 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.030942 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 17 20:24:27.031666 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.030984 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-5hhmz\""
Apr 17 20:24:27.033838 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.033814 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-mtgz9"]
Apr 17 20:24:27.205925 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.205875 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e698d0e1-d08e-4291-8466-488fbb4ef89f-data\") pod \"postgres-868db5846d-mtgz9\" (UID: \"e698d0e1-d08e-4291-8466-488fbb4ef89f\") " pod="opendatahub/postgres-868db5846d-mtgz9"
Apr 17 20:24:27.206129 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.205951 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s87c\" (UniqueName: \"kubernetes.io/projected/e698d0e1-d08e-4291-8466-488fbb4ef89f-kube-api-access-6s87c\") pod \"postgres-868db5846d-mtgz9\" (UID: \"e698d0e1-d08e-4291-8466-488fbb4ef89f\") " pod="opendatahub/postgres-868db5846d-mtgz9"
Apr 17 20:24:27.307436 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.307332 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e698d0e1-d08e-4291-8466-488fbb4ef89f-data\") pod \"postgres-868db5846d-mtgz9\" (UID: \"e698d0e1-d08e-4291-8466-488fbb4ef89f\") " pod="opendatahub/postgres-868db5846d-mtgz9"
Apr 17 20:24:27.307436 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.307382 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6s87c\" (UniqueName: \"kubernetes.io/projected/e698d0e1-d08e-4291-8466-488fbb4ef89f-kube-api-access-6s87c\") pod \"postgres-868db5846d-mtgz9\" (UID: \"e698d0e1-d08e-4291-8466-488fbb4ef89f\") " pod="opendatahub/postgres-868db5846d-mtgz9"
Apr 17 20:24:27.307734 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.307715 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e698d0e1-d08e-4291-8466-488fbb4ef89f-data\") pod \"postgres-868db5846d-mtgz9\" (UID: \"e698d0e1-d08e-4291-8466-488fbb4ef89f\") " pod="opendatahub/postgres-868db5846d-mtgz9"
Apr 17 20:24:27.314896 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.314870 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s87c\" (UniqueName: \"kubernetes.io/projected/e698d0e1-d08e-4291-8466-488fbb4ef89f-kube-api-access-6s87c\") pod \"postgres-868db5846d-mtgz9\" (UID: \"e698d0e1-d08e-4291-8466-488fbb4ef89f\") " pod="opendatahub/postgres-868db5846d-mtgz9"
Apr 17 20:24:27.341398 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.341361 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-mtgz9" Apr 17 20:24:27.671779 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.671726 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-mtgz9"] Apr 17 20:24:27.673236 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:24:27.673201 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode698d0e1_d08e_4291_8466_488fbb4ef89f.slice/crio-a2b022674f82108bfae6674077d1edc1c5d809e58816ad8b22bad401587f6dcc WatchSource:0}: Error finding container a2b022674f82108bfae6674077d1edc1c5d809e58816ad8b22bad401587f6dcc: Status 404 returned error can't find the container with id a2b022674f82108bfae6674077d1edc1c5d809e58816ad8b22bad401587f6dcc Apr 17 20:24:27.965468 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:27.965377 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-mtgz9" event={"ID":"e698d0e1-d08e-4291-8466-488fbb4ef89f","Type":"ContainerStarted","Data":"a2b022674f82108bfae6674077d1edc1c5d809e58816ad8b22bad401587f6dcc"} Apr 17 20:24:32.555979 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:32.555957 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 17 20:24:32.990318 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:32.990278 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-mtgz9" event={"ID":"e698d0e1-d08e-4291-8466-488fbb4ef89f","Type":"ContainerStarted","Data":"395478aaf02167f4e88cb83a50ada25635b6790237faafa58228255eab940cd7"} Apr 17 20:24:32.990505 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:32.990411 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-mtgz9" Apr 17 20:24:33.005194 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:33.005138 2579 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="opendatahub/postgres-868db5846d-mtgz9" podStartSLOduration=1.126013741 podStartE2EDuration="6.005122709s" podCreationTimestamp="2026-04-17 20:24:27 +0000 UTC" firstStartedPulling="2026-04-17 20:24:27.674481727 +0000 UTC m=+537.339407398" lastFinishedPulling="2026-04-17 20:24:32.55359069 +0000 UTC m=+542.218516366" observedRunningTime="2026-04-17 20:24:33.004318933 +0000 UTC m=+542.669244620" watchObservedRunningTime="2026-04-17 20:24:33.005122709 +0000 UTC m=+542.670048400" Apr 17 20:24:39.023160 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:39.023131 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-mtgz9" Apr 17 20:24:39.509945 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:39.509905 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-2sn4g"] Apr 17 20:24:39.512378 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:39.512359 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-2sn4g" Apr 17 20:24:39.518939 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:39.518910 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-2sn4g"] Apr 17 20:24:39.620384 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:39.620342 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkngp\" (UniqueName: \"kubernetes.io/projected/d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb-kube-api-access-rkngp\") pod \"authorino-8b475cf9f-2sn4g\" (UID: \"d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb\") " pod="kuadrant-system/authorino-8b475cf9f-2sn4g" Apr 17 20:24:39.721315 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:39.721271 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkngp\" (UniqueName: \"kubernetes.io/projected/d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb-kube-api-access-rkngp\") pod \"authorino-8b475cf9f-2sn4g\" (UID: \"d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb\") " pod="kuadrant-system/authorino-8b475cf9f-2sn4g" Apr 17 20:24:39.723010 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:39.722985 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-2sn4g"] Apr 17 20:24:39.723264 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:24:39.723239 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-rkngp], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-2sn4g" podUID="d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb" Apr 17 20:24:39.732939 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:39.732911 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkngp\" (UniqueName: \"kubernetes.io/projected/d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb-kube-api-access-rkngp\") pod \"authorino-8b475cf9f-2sn4g\" (UID: 
\"d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb\") " pod="kuadrant-system/authorino-8b475cf9f-2sn4g" Apr 17 20:24:40.001598 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.001561 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-757bffb484-rcrqm"] Apr 17 20:24:40.020363 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.020325 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-757bffb484-rcrqm"] Apr 17 20:24:40.020545 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.020408 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-2sn4g" Apr 17 20:24:40.020545 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.020453 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-757bffb484-rcrqm" Apr 17 20:24:40.022915 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.022884 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 20:24:40.026272 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.026249 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-2sn4g" Apr 17 20:24:40.124138 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.124097 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkngp\" (UniqueName: \"kubernetes.io/projected/d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb-kube-api-access-rkngp\") pod \"d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb\" (UID: \"d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb\") " Apr 17 20:24:40.124315 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.124298 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d6b5c13a-d9e1-405b-a334-54eba432d195-tls-cert\") pod \"authorino-757bffb484-rcrqm\" (UID: \"d6b5c13a-d9e1-405b-a334-54eba432d195\") " pod="kuadrant-system/authorino-757bffb484-rcrqm" Apr 17 20:24:40.124372 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.124356 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjjjn\" (UniqueName: \"kubernetes.io/projected/d6b5c13a-d9e1-405b-a334-54eba432d195-kube-api-access-kjjjn\") pod \"authorino-757bffb484-rcrqm\" (UID: \"d6b5c13a-d9e1-405b-a334-54eba432d195\") " pod="kuadrant-system/authorino-757bffb484-rcrqm" Apr 17 20:24:40.126478 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.126441 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb-kube-api-access-rkngp" (OuterVolumeSpecName: "kube-api-access-rkngp") pod "d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb" (UID: "d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb"). InnerVolumeSpecName "kube-api-access-rkngp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:24:40.225275 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.225236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d6b5c13a-d9e1-405b-a334-54eba432d195-tls-cert\") pod \"authorino-757bffb484-rcrqm\" (UID: \"d6b5c13a-d9e1-405b-a334-54eba432d195\") " pod="kuadrant-system/authorino-757bffb484-rcrqm" Apr 17 20:24:40.225443 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.225302 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjjjn\" (UniqueName: \"kubernetes.io/projected/d6b5c13a-d9e1-405b-a334-54eba432d195-kube-api-access-kjjjn\") pod \"authorino-757bffb484-rcrqm\" (UID: \"d6b5c13a-d9e1-405b-a334-54eba432d195\") " pod="kuadrant-system/authorino-757bffb484-rcrqm" Apr 17 20:24:40.225443 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.225360 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rkngp\" (UniqueName: \"kubernetes.io/projected/d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb-kube-api-access-rkngp\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:24:40.227824 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.227797 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d6b5c13a-d9e1-405b-a334-54eba432d195-tls-cert\") pod \"authorino-757bffb484-rcrqm\" (UID: \"d6b5c13a-d9e1-405b-a334-54eba432d195\") " pod="kuadrant-system/authorino-757bffb484-rcrqm" Apr 17 20:24:40.232694 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.232673 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjjjn\" (UniqueName: \"kubernetes.io/projected/d6b5c13a-d9e1-405b-a334-54eba432d195-kube-api-access-kjjjn\") pod \"authorino-757bffb484-rcrqm\" (UID: \"d6b5c13a-d9e1-405b-a334-54eba432d195\") " pod="kuadrant-system/authorino-757bffb484-rcrqm" Apr 17 
20:24:40.334983 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.334897 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-757bffb484-rcrqm" Apr 17 20:24:40.464472 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:40.464443 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-757bffb484-rcrqm"] Apr 17 20:24:40.465991 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:24:40.465959 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6b5c13a_d9e1_405b_a334_54eba432d195.slice/crio-9bc37d1edaa8fd27d0d27fc2e744ad8b8513ef18f81f5bf6ee3f7867e78ca92f WatchSource:0}: Error finding container 9bc37d1edaa8fd27d0d27fc2e744ad8b8513ef18f81f5bf6ee3f7867e78ca92f: Status 404 returned error can't find the container with id 9bc37d1edaa8fd27d0d27fc2e744ad8b8513ef18f81f5bf6ee3f7867e78ca92f Apr 17 20:24:41.022959 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.022890 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-757bffb484-rcrqm" event={"ID":"d6b5c13a-d9e1-405b-a334-54eba432d195","Type":"ContainerStarted","Data":"6dc01a16c77a5a6d5375241ac7539c97119925b09b2d2ddeaa6009f396a7e1e2"} Apr 17 20:24:41.022959 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.022949 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-757bffb484-rcrqm" event={"ID":"d6b5c13a-d9e1-405b-a334-54eba432d195","Type":"ContainerStarted","Data":"9bc37d1edaa8fd27d0d27fc2e744ad8b8513ef18f81f5bf6ee3f7867e78ca92f"} Apr 17 20:24:41.023228 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.022974 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-2sn4g" Apr 17 20:24:41.038824 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.038769 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-757bffb484-rcrqm" podStartSLOduration=1.630252741 podStartE2EDuration="2.038737304s" podCreationTimestamp="2026-04-17 20:24:39 +0000 UTC" firstStartedPulling="2026-04-17 20:24:40.467284113 +0000 UTC m=+550.132209782" lastFinishedPulling="2026-04-17 20:24:40.875768663 +0000 UTC m=+550.540694345" observedRunningTime="2026-04-17 20:24:41.036127312 +0000 UTC m=+550.701053002" watchObservedRunningTime="2026-04-17 20:24:41.038737304 +0000 UTC m=+550.703663020" Apr 17 20:24:41.064282 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.064054 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-2sn4g"] Apr 17 20:24:41.067923 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.067885 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-2sn4g"] Apr 17 20:24:41.071560 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.071532 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-h4t68"] Apr 17 20:24:41.071807 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.071780 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-h4t68" podUID="800f372e-8a0c-44a6-82ce-ee20094d9273" containerName="authorino" containerID="cri-o://ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214" gracePeriod=30 Apr 17 20:24:41.303317 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.303285 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-h4t68" Apr 17 20:24:41.334330 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.334288 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6bxk\" (UniqueName: \"kubernetes.io/projected/800f372e-8a0c-44a6-82ce-ee20094d9273-kube-api-access-p6bxk\") pod \"800f372e-8a0c-44a6-82ce-ee20094d9273\" (UID: \"800f372e-8a0c-44a6-82ce-ee20094d9273\") " Apr 17 20:24:41.336492 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.336461 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/800f372e-8a0c-44a6-82ce-ee20094d9273-kube-api-access-p6bxk" (OuterVolumeSpecName: "kube-api-access-p6bxk") pod "800f372e-8a0c-44a6-82ce-ee20094d9273" (UID: "800f372e-8a0c-44a6-82ce-ee20094d9273"). InnerVolumeSpecName "kube-api-access-p6bxk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:24:41.435467 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.435429 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p6bxk\" (UniqueName: \"kubernetes.io/projected/800f372e-8a0c-44a6-82ce-ee20094d9273-kube-api-access-p6bxk\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:24:41.883068 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.883034 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8bh8b"] Apr 17 20:24:41.883524 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.883504 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="800f372e-8a0c-44a6-82ce-ee20094d9273" containerName="authorino" Apr 17 20:24:41.883587 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.883530 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="800f372e-8a0c-44a6-82ce-ee20094d9273" containerName="authorino" Apr 17 20:24:41.883644 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.883632 2579 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="800f372e-8a0c-44a6-82ce-ee20094d9273" containerName="authorino" Apr 17 20:24:41.944595 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.944551 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8bh8b"] Apr 17 20:24:41.944781 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.944699 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" Apr 17 20:24:41.947091 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:41.947065 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-fwz42\"" Apr 17 20:24:42.034919 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.034837 2579 generic.go:358] "Generic (PLEG): container finished" podID="800f372e-8a0c-44a6-82ce-ee20094d9273" containerID="ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214" exitCode=0 Apr 17 20:24:42.034919 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.034882 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-h4t68" event={"ID":"800f372e-8a0c-44a6-82ce-ee20094d9273","Type":"ContainerDied","Data":"ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214"} Apr 17 20:24:42.034919 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.034899 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-h4t68" Apr 17 20:24:42.035160 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.034929 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-h4t68" event={"ID":"800f372e-8a0c-44a6-82ce-ee20094d9273","Type":"ContainerDied","Data":"114ac02a9a3bf8e8a7d2f20cf0d4b52ac711338ca577fd52dc73be6b192e7e33"} Apr 17 20:24:42.035160 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.034950 2579 scope.go:117] "RemoveContainer" containerID="ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214" Apr 17 20:24:42.040928 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.040885 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sxk5\" (UniqueName: \"kubernetes.io/projected/553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2-kube-api-access-8sxk5\") pod \"maas-controller-6d4c8f55f9-8bh8b\" (UID: \"553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2\") " pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" Apr 17 20:24:42.044937 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.044917 2579 scope.go:117] "RemoveContainer" containerID="ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214" Apr 17 20:24:42.045247 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:24:42.045226 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214\": container with ID starting with ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214 not found: ID does not exist" containerID="ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214" Apr 17 20:24:42.045322 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.045256 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214"} err="failed to get 
container status \"ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214\": rpc error: code = NotFound desc = could not find container \"ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214\": container with ID starting with ea93be5e0a6c08d517899af468ed82fc03abfd1bac7cff0b96968651ffffe214 not found: ID does not exist" Apr 17 20:24:42.056653 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.056621 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-h4t68"] Apr 17 20:24:42.059248 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.059225 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-h4t68"] Apr 17 20:24:42.142518 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.142485 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8sxk5\" (UniqueName: \"kubernetes.io/projected/553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2-kube-api-access-8sxk5\") pod \"maas-controller-6d4c8f55f9-8bh8b\" (UID: \"553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2\") " pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" Apr 17 20:24:42.152166 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.152134 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sxk5\" (UniqueName: \"kubernetes.io/projected/553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2-kube-api-access-8sxk5\") pod \"maas-controller-6d4c8f55f9-8bh8b\" (UID: \"553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2\") " pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" Apr 17 20:24:42.158144 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.158114 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-79f6b4d558-mrztc"] Apr 17 20:24:42.196687 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.196654 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-79f6b4d558-mrztc"] Apr 17 20:24:42.196871 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:24:42.196817 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-79f6b4d558-mrztc" Apr 17 20:24:42.243408 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.243372 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s22b9\" (UniqueName: \"kubernetes.io/projected/976011a1-f178-44b3-8594-6b5c8367c202-kube-api-access-s22b9\") pod \"maas-controller-79f6b4d558-mrztc\" (UID: \"976011a1-f178-44b3-8594-6b5c8367c202\") " pod="opendatahub/maas-controller-79f6b4d558-mrztc" Apr 17 20:24:42.255296 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.255266 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" Apr 17 20:24:42.344346 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.344315 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s22b9\" (UniqueName: \"kubernetes.io/projected/976011a1-f178-44b3-8594-6b5c8367c202-kube-api-access-s22b9\") pod \"maas-controller-79f6b4d558-mrztc\" (UID: \"976011a1-f178-44b3-8594-6b5c8367c202\") " pod="opendatahub/maas-controller-79f6b4d558-mrztc" Apr 17 20:24:42.352939 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.352910 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s22b9\" (UniqueName: \"kubernetes.io/projected/976011a1-f178-44b3-8594-6b5c8367c202-kube-api-access-s22b9\") pod \"maas-controller-79f6b4d558-mrztc\" (UID: \"976011a1-f178-44b3-8594-6b5c8367c202\") " pod="opendatahub/maas-controller-79f6b4d558-mrztc" Apr 17 20:24:42.387483 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.387458 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8bh8b"] Apr 17 20:24:42.388541 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:24:42.388514 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod553d2b22_df91_4cd2_a3fc_e8b41bc1e1d2.slice/crio-c19c1e0727f60489abae58f760593a11db03f7fb32dac76d3d2d601dbf034c18 WatchSource:0}: Error finding container c19c1e0727f60489abae58f760593a11db03f7fb32dac76d3d2d601dbf034c18: Status 404 returned error can't find the container with id c19c1e0727f60489abae58f760593a11db03f7fb32dac76d3d2d601dbf034c18 Apr 17 20:24:42.507628 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.507590 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-79f6b4d558-mrztc" Apr 17 20:24:42.633987 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.633953 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-79f6b4d558-mrztc"] Apr 17 20:24:42.634785 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:24:42.634740 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod976011a1_f178_44b3_8594_6b5c8367c202.slice/crio-239f90bf9b92f1959b4d9b7d08d608fd32dd4f8bc52c5777a8230b7b9ac669b4 WatchSource:0}: Error finding container 239f90bf9b92f1959b4d9b7d08d608fd32dd4f8bc52c5777a8230b7b9ac669b4: Status 404 returned error can't find the container with id 239f90bf9b92f1959b4d9b7d08d608fd32dd4f8bc52c5777a8230b7b9ac669b4 Apr 17 20:24:42.879466 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.879431 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="800f372e-8a0c-44a6-82ce-ee20094d9273" path="/var/lib/kubelet/pods/800f372e-8a0c-44a6-82ce-ee20094d9273/volumes" Apr 17 20:24:42.879954 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:42.879934 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb" path="/var/lib/kubelet/pods/d41d9837-1ce8-4b6f-9d6a-a4c4202fc5fb/volumes" Apr 17 20:24:43.044180 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:43.044136 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" event={"ID":"553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2","Type":"ContainerStarted","Data":"c19c1e0727f60489abae58f760593a11db03f7fb32dac76d3d2d601dbf034c18"} Apr 17 20:24:43.046315 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:43.046218 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79f6b4d558-mrztc" event={"ID":"976011a1-f178-44b3-8594-6b5c8367c202","Type":"ContainerStarted","Data":"239f90bf9b92f1959b4d9b7d08d608fd32dd4f8bc52c5777a8230b7b9ac669b4"} Apr 17 20:24:46.060609 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:46.060569 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79f6b4d558-mrztc" event={"ID":"976011a1-f178-44b3-8594-6b5c8367c202","Type":"ContainerStarted","Data":"a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515"} Apr 17 20:24:46.061122 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:46.060634 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-79f6b4d558-mrztc" Apr 17 20:24:46.062022 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:46.062001 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" event={"ID":"553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2","Type":"ContainerStarted","Data":"41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646"} Apr 17 20:24:46.062141 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:46.062120 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" Apr 17 20:24:46.078076 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:46.078031 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-79f6b4d558-mrztc" podStartSLOduration=1.322669693 podStartE2EDuration="4.078016023s" podCreationTimestamp="2026-04-17 20:24:42 +0000 UTC" 
firstStartedPulling="2026-04-17 20:24:42.636280327 +0000 UTC m=+552.301206011" lastFinishedPulling="2026-04-17 20:24:45.391626672 +0000 UTC m=+555.056552341" observedRunningTime="2026-04-17 20:24:46.077170646 +0000 UTC m=+555.742096336" watchObservedRunningTime="2026-04-17 20:24:46.078016023 +0000 UTC m=+555.742941713" Apr 17 20:24:46.093205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:46.093157 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" podStartSLOduration=2.095763296 podStartE2EDuration="5.09314356s" podCreationTimestamp="2026-04-17 20:24:41 +0000 UTC" firstStartedPulling="2026-04-17 20:24:42.389934822 +0000 UTC m=+552.054860494" lastFinishedPulling="2026-04-17 20:24:45.387315087 +0000 UTC m=+555.052240758" observedRunningTime="2026-04-17 20:24:46.090679489 +0000 UTC m=+555.755605180" watchObservedRunningTime="2026-04-17 20:24:46.09314356 +0000 UTC m=+555.758069250" Apr 17 20:24:57.072035 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:57.072001 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" Apr 17 20:24:57.072472 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:57.072066 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-79f6b4d558-mrztc" Apr 17 20:24:57.123829 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:57.123795 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8bh8b"] Apr 17 20:24:57.124066 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:57.124021 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" podUID="553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2" containerName="manager" containerID="cri-o://41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646" gracePeriod=10 Apr 17 20:24:57.365395 ip-10-0-139-2 kubenswrapper[2579]: 
I0417 20:24:57.365366 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" Apr 17 20:24:57.486417 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:57.486381 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sxk5\" (UniqueName: \"kubernetes.io/projected/553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2-kube-api-access-8sxk5\") pod \"553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2\" (UID: \"553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2\") " Apr 17 20:24:57.488654 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:57.488630 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2-kube-api-access-8sxk5" (OuterVolumeSpecName: "kube-api-access-8sxk5") pod "553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2" (UID: "553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2"). InnerVolumeSpecName "kube-api-access-8sxk5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:24:57.587793 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:57.587722 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8sxk5\" (UniqueName: \"kubernetes.io/projected/553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2-kube-api-access-8sxk5\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:24:58.112764 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:58.112709 2579 generic.go:358] "Generic (PLEG): container finished" podID="553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2" containerID="41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646" exitCode=0 Apr 17 20:24:58.113208 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:58.112778 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" event={"ID":"553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2","Type":"ContainerDied","Data":"41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646"} Apr 17 20:24:58.113208 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:24:58.112817 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" event={"ID":"553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2","Type":"ContainerDied","Data":"c19c1e0727f60489abae58f760593a11db03f7fb32dac76d3d2d601dbf034c18"} Apr 17 20:24:58.113208 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:58.112834 2579 scope.go:117] "RemoveContainer" containerID="41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646" Apr 17 20:24:58.113208 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:58.112784 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-8bh8b" Apr 17 20:24:58.121847 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:58.121825 2579 scope.go:117] "RemoveContainer" containerID="41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646" Apr 17 20:24:58.122101 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:24:58.122081 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646\": container with ID starting with 41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646 not found: ID does not exist" containerID="41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646" Apr 17 20:24:58.122179 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:58.122114 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646"} err="failed to get container status \"41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646\": rpc error: code = NotFound desc = could not find container \"41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646\": container with ID starting with 41dff41c7fc35391efa40c025d74d263d3466bc43a4d452cd1420b284bfe4646 not found: ID does not exist" Apr 
17 20:24:58.135231 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:58.135203 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8bh8b"] Apr 17 20:24:58.137576 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:58.137550 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8bh8b"] Apr 17 20:24:58.884212 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:24:58.884174 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2" path="/var/lib/kubelet/pods/553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2/volumes" Apr 17 20:25:11.195402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:11.195358 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-79f6b4d558-mrztc"] Apr 17 20:25:11.195936 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:11.195658 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-79f6b4d558-mrztc" podUID="976011a1-f178-44b3-8594-6b5c8367c202" containerName="manager" containerID="cri-o://a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515" gracePeriod=10 Apr 17 20:25:11.445937 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:11.445869 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-79f6b4d558-mrztc" Apr 17 20:25:11.615695 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:11.615649 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s22b9\" (UniqueName: \"kubernetes.io/projected/976011a1-f178-44b3-8594-6b5c8367c202-kube-api-access-s22b9\") pod \"976011a1-f178-44b3-8594-6b5c8367c202\" (UID: \"976011a1-f178-44b3-8594-6b5c8367c202\") " Apr 17 20:25:11.617922 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:11.617885 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/976011a1-f178-44b3-8594-6b5c8367c202-kube-api-access-s22b9" (OuterVolumeSpecName: "kube-api-access-s22b9") pod "976011a1-f178-44b3-8594-6b5c8367c202" (UID: "976011a1-f178-44b3-8594-6b5c8367c202"). InnerVolumeSpecName "kube-api-access-s22b9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:25:11.716709 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:11.716608 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s22b9\" (UniqueName: \"kubernetes.io/projected/976011a1-f178-44b3-8594-6b5c8367c202-kube-api-access-s22b9\") on node \"ip-10-0-139-2.ec2.internal\" DevicePath \"\"" Apr 17 20:25:12.170043 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:12.170004 2579 generic.go:358] "Generic (PLEG): container finished" podID="976011a1-f178-44b3-8594-6b5c8367c202" containerID="a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515" exitCode=0 Apr 17 20:25:12.170240 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:12.170072 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79f6b4d558-mrztc" event={"ID":"976011a1-f178-44b3-8594-6b5c8367c202","Type":"ContainerDied","Data":"a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515"} Apr 17 20:25:12.170240 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:12.170091 2579 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="opendatahub/maas-controller-79f6b4d558-mrztc" Apr 17 20:25:12.170240 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:12.170108 2579 scope.go:117] "RemoveContainer" containerID="a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515" Apr 17 20:25:12.170240 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:12.170098 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79f6b4d558-mrztc" event={"ID":"976011a1-f178-44b3-8594-6b5c8367c202","Type":"ContainerDied","Data":"239f90bf9b92f1959b4d9b7d08d608fd32dd4f8bc52c5777a8230b7b9ac669b4"} Apr 17 20:25:12.182005 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:12.181986 2579 scope.go:117] "RemoveContainer" containerID="a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515" Apr 17 20:25:12.182313 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:25:12.182293 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515\": container with ID starting with a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515 not found: ID does not exist" containerID="a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515" Apr 17 20:25:12.182368 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:12.182322 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515"} err="failed to get container status \"a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515\": rpc error: code = NotFound desc = could not find container \"a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515\": container with ID starting with a0401445a924d4d782d661beca2ec87c67fcd540cf9b531efc495d7de94d4515 not found: ID does not exist" Apr 17 20:25:12.194108 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:12.194076 2579 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-79f6b4d558-mrztc"] Apr 17 20:25:12.199486 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:12.199459 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-79f6b4d558-mrztc"] Apr 17 20:25:12.879295 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:12.879262 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="976011a1-f178-44b3-8594-6b5c8367c202" path="/var/lib/kubelet/pods/976011a1-f178-44b3-8594-6b5c8367c202/volumes" Apr 17 20:25:18.085208 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.085175 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-86c984498-ggc2l"] Apr 17 20:25:18.085667 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.085590 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="976011a1-f178-44b3-8594-6b5c8367c202" containerName="manager" Apr 17 20:25:18.085667 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.085603 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="976011a1-f178-44b3-8594-6b5c8367c202" containerName="manager" Apr 17 20:25:18.085667 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.085617 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2" containerName="manager" Apr 17 20:25:18.085667 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.085623 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2" containerName="manager" Apr 17 20:25:18.085840 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.085693 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="553d2b22-df91-4cd2-a3fc-e8b41bc1e1d2" containerName="manager" Apr 17 20:25:18.085840 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.085705 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="976011a1-f178-44b3-8594-6b5c8367c202" 
containerName="manager" Apr 17 20:25:18.092612 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.092591 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-86c984498-ggc2l" Apr 17 20:25:18.095251 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.095225 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 20:25:18.095556 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.095389 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 20:25:18.095656 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.095587 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-z7qbr\"" Apr 17 20:25:18.098495 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.098468 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-86c984498-ggc2l"] Apr 17 20:25:18.173724 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.173684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clrss\" (UniqueName: \"kubernetes.io/projected/0c11c3fa-c709-4adc-a21c-1a35e0443193-kube-api-access-clrss\") pod \"maas-api-86c984498-ggc2l\" (UID: \"0c11c3fa-c709-4adc-a21c-1a35e0443193\") " pod="opendatahub/maas-api-86c984498-ggc2l" Apr 17 20:25:18.173938 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.173838 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0c11c3fa-c709-4adc-a21c-1a35e0443193-maas-api-tls\") pod \"maas-api-86c984498-ggc2l\" (UID: \"0c11c3fa-c709-4adc-a21c-1a35e0443193\") " pod="opendatahub/maas-api-86c984498-ggc2l" Apr 17 20:25:18.274696 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.274662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-clrss\" (UniqueName: \"kubernetes.io/projected/0c11c3fa-c709-4adc-a21c-1a35e0443193-kube-api-access-clrss\") pod \"maas-api-86c984498-ggc2l\" (UID: \"0c11c3fa-c709-4adc-a21c-1a35e0443193\") " pod="opendatahub/maas-api-86c984498-ggc2l" Apr 17 20:25:18.274906 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.274726 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0c11c3fa-c709-4adc-a21c-1a35e0443193-maas-api-tls\") pod \"maas-api-86c984498-ggc2l\" (UID: \"0c11c3fa-c709-4adc-a21c-1a35e0443193\") " pod="opendatahub/maas-api-86c984498-ggc2l" Apr 17 20:25:18.274906 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:25:18.274875 2579 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 17 20:25:18.274987 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:25:18.274949 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c11c3fa-c709-4adc-a21c-1a35e0443193-maas-api-tls podName:0c11c3fa-c709-4adc-a21c-1a35e0443193 nodeName:}" failed. No retries permitted until 2026-04-17 20:25:18.77493127 +0000 UTC m=+588.439856952 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/0c11c3fa-c709-4adc-a21c-1a35e0443193-maas-api-tls") pod "maas-api-86c984498-ggc2l" (UID: "0c11c3fa-c709-4adc-a21c-1a35e0443193") : secret "maas-api-serving-cert" not found Apr 17 20:25:18.287832 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.287804 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clrss\" (UniqueName: \"kubernetes.io/projected/0c11c3fa-c709-4adc-a21c-1a35e0443193-kube-api-access-clrss\") pod \"maas-api-86c984498-ggc2l\" (UID: \"0c11c3fa-c709-4adc-a21c-1a35e0443193\") " pod="opendatahub/maas-api-86c984498-ggc2l" Apr 17 20:25:18.778213 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.778165 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0c11c3fa-c709-4adc-a21c-1a35e0443193-maas-api-tls\") pod \"maas-api-86c984498-ggc2l\" (UID: \"0c11c3fa-c709-4adc-a21c-1a35e0443193\") " pod="opendatahub/maas-api-86c984498-ggc2l" Apr 17 20:25:18.780706 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:18.780674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0c11c3fa-c709-4adc-a21c-1a35e0443193-maas-api-tls\") pod \"maas-api-86c984498-ggc2l\" (UID: \"0c11c3fa-c709-4adc-a21c-1a35e0443193\") " pod="opendatahub/maas-api-86c984498-ggc2l" Apr 17 20:25:19.004495 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:19.004460 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-86c984498-ggc2l" Apr 17 20:25:19.138380 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:19.138352 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-86c984498-ggc2l"] Apr 17 20:25:19.140810 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:25:19.140783 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c11c3fa_c709_4adc_a21c_1a35e0443193.slice/crio-7357fc6256349a27ac8e3ba069be470a6237a5b1c71d36de6b0947ed7ac2aaf1 WatchSource:0}: Error finding container 7357fc6256349a27ac8e3ba069be470a6237a5b1c71d36de6b0947ed7ac2aaf1: Status 404 returned error can't find the container with id 7357fc6256349a27ac8e3ba069be470a6237a5b1c71d36de6b0947ed7ac2aaf1 Apr 17 20:25:19.201208 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:19.201169 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-86c984498-ggc2l" event={"ID":"0c11c3fa-c709-4adc-a21c-1a35e0443193","Type":"ContainerStarted","Data":"7357fc6256349a27ac8e3ba069be470a6237a5b1c71d36de6b0947ed7ac2aaf1"} Apr 17 20:25:22.214941 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:22.214902 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-86c984498-ggc2l" event={"ID":"0c11c3fa-c709-4adc-a21c-1a35e0443193","Type":"ContainerStarted","Data":"9d4f74b12fca4cf43d6420f39b4205ee18171f1163265076b426cb7b3824d8ab"} Apr 17 20:25:22.215330 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:22.214988 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-86c984498-ggc2l" Apr 17 20:25:22.232483 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:22.232434 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-86c984498-ggc2l" podStartSLOduration=1.868689427 podStartE2EDuration="4.232420191s" podCreationTimestamp="2026-04-17 20:25:18 +0000 UTC" 
firstStartedPulling="2026-04-17 20:25:19.142463839 +0000 UTC m=+588.807389510" lastFinishedPulling="2026-04-17 20:25:21.506194606 +0000 UTC m=+591.171120274" observedRunningTime="2026-04-17 20:25:22.229819985 +0000 UTC m=+591.894745677" watchObservedRunningTime="2026-04-17 20:25:22.232420191 +0000 UTC m=+591.897345932" Apr 17 20:25:28.224155 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:28.224123 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-86c984498-ggc2l" Apr 17 20:25:30.803127 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:30.803095 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/1.log" Apr 17 20:25:30.805804 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:30.805780 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/1.log" Apr 17 20:25:45.317117 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.317077 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg"] Apr 17 20:25:45.321979 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.321958 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.324297 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.324269 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 20:25:45.324415 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.324381 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 20:25:45.325309 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.325293 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 17 20:25:45.325358 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.325330 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-vsxc9\"" Apr 17 20:25:45.330200 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.330065 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg"] Apr 17 20:25:45.420576 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.420543 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.420775 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.420581 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b748m\" (UniqueName: \"kubernetes.io/projected/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-kube-api-access-b748m\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 
20:25:45.420775 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.420639 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.420775 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.420674 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.420775 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.420713 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.420943 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.420854 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.521862 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.521829 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.522063 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.521926 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.522063 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.521963 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.522063 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.521989 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b748m\" (UniqueName: \"kubernetes.io/projected/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-kube-api-access-b748m\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.522063 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.522027 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.522063 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:25:45.522061 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.522333 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.522295 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.522388 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.522373 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.522539 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.522515 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.524451 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.524424 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-dshm\") pod 
\"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.524588 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.524571 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.529604 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.529582 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b748m\" (UniqueName: \"kubernetes.io/projected/1e90011d-ca6d-4adf-a79f-d1f39ea0069b-kube-api-access-b748m\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-954qg\" (UID: \"1e90011d-ca6d-4adf-a79f-d1f39ea0069b\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.634796 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.634730 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:25:45.778860 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.778831 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg"] Apr 17 20:25:45.780260 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:25:45.780234 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e90011d_ca6d_4adf_a79f_d1f39ea0069b.slice/crio-fd24aa08fa158a65dfb2b72df3173c522ccbd8d668c5630396760f6b0f184f4d WatchSource:0}: Error finding container fd24aa08fa158a65dfb2b72df3173c522ccbd8d668c5630396760f6b0f184f4d: Status 404 returned error can't find the container with id fd24aa08fa158a65dfb2b72df3173c522ccbd8d668c5630396760f6b0f184f4d Apr 17 20:25:45.782139 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:45.782115 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:25:46.307764 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:46.307703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" event={"ID":"1e90011d-ca6d-4adf-a79f-d1f39ea0069b","Type":"ContainerStarted","Data":"fd24aa08fa158a65dfb2b72df3173c522ccbd8d668c5630396760f6b0f184f4d"} Apr 17 20:25:51.820603 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:51.820567 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk"] Apr 17 20:25:51.824327 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:51.824310 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:51.826445 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:51.826425 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 17 20:25:51.832152 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:51.832126 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk"] Apr 17 20:25:51.987159 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:51.987117 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:51.987159 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:51.987160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq5p2\" (UniqueName: \"kubernetes.io/projected/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-kube-api-access-dq5p2\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:51.987372 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:51.987299 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:51.987427 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:25:51.987410 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:51.987466 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:51.987444 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:51.987502 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:51.987482 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.088232 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.088125 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.088232 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.088196 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-dshm\") pod 
\"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.088232 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.088219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.088508 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.088249 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.088508 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.088286 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.088508 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.088314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq5p2\" (UniqueName: \"kubernetes.io/projected/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-kube-api-access-dq5p2\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.088730 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:25:52.088708 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.088844 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.088732 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.088920 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.088851 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.090714 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.090687 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.090967 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.090948 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-tls-certs\") pod 
\"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.096599 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.096575 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq5p2\" (UniqueName: \"kubernetes.io/projected/302d51b4-fba4-4d9f-b1a0-c022c50b4dd2-kube-api-access-dq5p2\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk\" (UID: \"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.135261 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.135218 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:25:52.277992 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.277964 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk"] Apr 17 20:25:52.280393 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:25:52.280362 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod302d51b4_fba4_4d9f_b1a0_c022c50b4dd2.slice/crio-921c308592bc5d4f14bd891051a6ecc5e2a0fddbce61b86d48f3b141e10d8758 WatchSource:0}: Error finding container 921c308592bc5d4f14bd891051a6ecc5e2a0fddbce61b86d48f3b141e10d8758: Status 404 returned error can't find the container with id 921c308592bc5d4f14bd891051a6ecc5e2a0fddbce61b86d48f3b141e10d8758 Apr 17 20:25:52.335421 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:52.335384 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" event={"ID":"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2","Type":"ContainerStarted","Data":"921c308592bc5d4f14bd891051a6ecc5e2a0fddbce61b86d48f3b141e10d8758"} Apr 17 20:25:52.336772 ip-10-0-139-2 kubenswrapper[2579]: 
I0417 20:25:52.336720 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" event={"ID":"1e90011d-ca6d-4adf-a79f-d1f39ea0069b","Type":"ContainerStarted","Data":"a784c6f8aacad00fe74fbae67e65a4b81cdefce3f8e1a312e491268d2a930601"} Apr 17 20:25:53.343269 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:53.343223 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" event={"ID":"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2","Type":"ContainerStarted","Data":"9fae2e6ee884eb278fd8bc7c0733caef6461ff4a8b227ffc774fb437d33e377d"} Apr 17 20:25:57.364289 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:57.364196 2579 generic.go:358] "Generic (PLEG): container finished" podID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" containerID="a784c6f8aacad00fe74fbae67e65a4b81cdefce3f8e1a312e491268d2a930601" exitCode=0 Apr 17 20:25:57.364289 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:57.364275 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" event={"ID":"1e90011d-ca6d-4adf-a79f-d1f39ea0069b","Type":"ContainerDied","Data":"a784c6f8aacad00fe74fbae67e65a4b81cdefce3f8e1a312e491268d2a930601"} Apr 17 20:25:58.370646 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:58.370559 2579 generic.go:358] "Generic (PLEG): container finished" podID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" containerID="9fae2e6ee884eb278fd8bc7c0733caef6461ff4a8b227ffc774fb437d33e377d" exitCode=0 Apr 17 20:25:58.371065 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:58.370635 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" event={"ID":"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2","Type":"ContainerDied","Data":"9fae2e6ee884eb278fd8bc7c0733caef6461ff4a8b227ffc774fb437d33e377d"} Apr 17 20:25:59.376417 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:59.376386 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/0.log" Apr 17 20:25:59.376865 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:59.376670 2579 generic.go:358] "Generic (PLEG): container finished" podID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" containerID="df1df5ca6e180e924b4237c734467a1899c5ee1515680dddfdc622e2e81b4db7" exitCode=2 Apr 17 20:25:59.376865 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:59.376728 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" event={"ID":"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2","Type":"ContainerDied","Data":"df1df5ca6e180e924b4237c734467a1899c5ee1515680dddfdc622e2e81b4db7"} Apr 17 20:25:59.377222 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:59.377199 2579 scope.go:117] "RemoveContainer" containerID="df1df5ca6e180e924b4237c734467a1899c5ee1515680dddfdc622e2e81b4db7" Apr 17 20:25:59.378374 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:59.378359 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/0.log" Apr 17 20:25:59.378696 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:59.378674 2579 generic.go:358] "Generic (PLEG): container finished" podID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" containerID="b154d2e9306ae08df8f416547cdc16b60544757b35a244db8e3d4b4e818c1fc7" exitCode=2 Apr 17 20:25:59.378801 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:59.378780 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" event={"ID":"1e90011d-ca6d-4adf-a79f-d1f39ea0069b","Type":"ContainerDied","Data":"b154d2e9306ae08df8f416547cdc16b60544757b35a244db8e3d4b4e818c1fc7"} Apr 17 20:25:59.379224 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:25:59.379210 2579 scope.go:117] "RemoveContainer" containerID="b154d2e9306ae08df8f416547cdc16b60544757b35a244db8e3d4b4e818c1fc7" Apr 
17 20:26:00.384536 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:00.384507 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/1.log" Apr 17 20:26:00.385002 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:00.384950 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/0.log" Apr 17 20:26:00.385297 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:00.385276 2579 generic.go:358] "Generic (PLEG): container finished" podID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" containerID="68e23fdd4c25ea027afd923925406f487906d3082fa85db9948293074c4c1870" exitCode=2 Apr 17 20:26:00.385366 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:00.385348 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" event={"ID":"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2","Type":"ContainerDied","Data":"68e23fdd4c25ea027afd923925406f487906d3082fa85db9948293074c4c1870"} Apr 17 20:26:00.385417 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:00.385390 2579 scope.go:117] "RemoveContainer" containerID="df1df5ca6e180e924b4237c734467a1899c5ee1515680dddfdc622e2e81b4db7" Apr 17 20:26:00.385879 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:00.385858 2579 scope.go:117] "RemoveContainer" containerID="68e23fdd4c25ea027afd923925406f487906d3082fa85db9948293074c4c1870" Apr 17 20:26:00.386099 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:00.386078 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:26:00.386908 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:00.386892 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/1.log" Apr 17 20:26:00.387251 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:00.387230 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/0.log" Apr 17 20:26:00.387531 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:00.387513 2579 generic.go:358] "Generic (PLEG): container finished" podID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" containerID="386f2da8ed1cbc2cbf040b4b405bd9d4e5eb00398d56c900fdc5e9bb95afb3a2" exitCode=2 Apr 17 20:26:00.387602 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:00.387574 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" event={"ID":"1e90011d-ca6d-4adf-a79f-d1f39ea0069b","Type":"ContainerDied","Data":"386f2da8ed1cbc2cbf040b4b405bd9d4e5eb00398d56c900fdc5e9bb95afb3a2"} Apr 17 20:26:00.387960 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:00.387945 2579 scope.go:117] "RemoveContainer" containerID="386f2da8ed1cbc2cbf040b4b405bd9d4e5eb00398d56c900fdc5e9bb95afb3a2" Apr 17 20:26:00.388148 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:00.388131 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:26:00.396897 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:00.396880 2579 scope.go:117] "RemoveContainer" containerID="b154d2e9306ae08df8f416547cdc16b60544757b35a244db8e3d4b4e818c1fc7" Apr 17 20:26:01.393343 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:26:01.393314 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/1.log" Apr 17 20:26:01.394963 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:01.394943 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/1.log" Apr 17 20:26:02.135867 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:02.135831 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:26:02.135867 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:02.135866 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:26:02.136371 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:02.136353 2579 scope.go:117] "RemoveContainer" containerID="68e23fdd4c25ea027afd923925406f487906d3082fa85db9948293074c4c1870" Apr 17 20:26:02.136597 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:02.136572 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:26:05.635612 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:05.635564 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:26:05.635612 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:05.635608 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:26:05.636143 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:26:05.636068 2579 scope.go:117] "RemoveContainer" containerID="386f2da8ed1cbc2cbf040b4b405bd9d4e5eb00398d56c900fdc5e9bb95afb3a2" Apr 17 20:26:05.636275 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:05.636256 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:26:17.874611 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:17.874575 2579 scope.go:117] "RemoveContainer" containerID="68e23fdd4c25ea027afd923925406f487906d3082fa85db9948293074c4c1870" Apr 17 20:26:18.468731 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:18.468694 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/2.log" Apr 17 20:26:18.469143 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:18.469127 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/1.log" Apr 17 20:26:18.469446 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:18.469423 2579 generic.go:358] "Generic (PLEG): container finished" podID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" containerID="bde437c9b774242f8e1bc721022f77907abbae62a0e35e235c1238d10eb6b820" exitCode=2 Apr 17 20:26:18.469521 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:18.469499 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" event={"ID":"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2","Type":"ContainerDied","Data":"bde437c9b774242f8e1bc721022f77907abbae62a0e35e235c1238d10eb6b820"} Apr 17 20:26:18.469579 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:26:18.469548 2579 scope.go:117] "RemoveContainer" containerID="68e23fdd4c25ea027afd923925406f487906d3082fa85db9948293074c4c1870" Apr 17 20:26:18.470087 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:18.470064 2579 scope.go:117] "RemoveContainer" containerID="bde437c9b774242f8e1bc721022f77907abbae62a0e35e235c1238d10eb6b820" Apr 17 20:26:18.470304 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:18.470282 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:26:19.474982 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:19.474951 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/2.log" Apr 17 20:26:19.873826 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:19.873719 2579 scope.go:117] "RemoveContainer" containerID="386f2da8ed1cbc2cbf040b4b405bd9d4e5eb00398d56c900fdc5e9bb95afb3a2" Apr 17 20:26:20.481067 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:20.480991 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/2.log" Apr 17 20:26:20.481475 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:20.481384 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/1.log" Apr 17 20:26:20.481709 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:20.481686 2579 generic.go:358] "Generic (PLEG): container finished" podID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" containerID="aece50b96a00dfde88379fba12553e33f514956d879de3d69f9daa2e896a5a37" exitCode=2 
Apr 17 20:26:20.481827 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:20.481770 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" event={"ID":"1e90011d-ca6d-4adf-a79f-d1f39ea0069b","Type":"ContainerDied","Data":"aece50b96a00dfde88379fba12553e33f514956d879de3d69f9daa2e896a5a37"} Apr 17 20:26:20.481827 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:20.481824 2579 scope.go:117] "RemoveContainer" containerID="386f2da8ed1cbc2cbf040b4b405bd9d4e5eb00398d56c900fdc5e9bb95afb3a2" Apr 17 20:26:20.482256 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:20.482235 2579 scope.go:117] "RemoveContainer" containerID="aece50b96a00dfde88379fba12553e33f514956d879de3d69f9daa2e896a5a37" Apr 17 20:26:20.482512 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:20.482490 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:26:21.486903 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:21.486873 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/2.log" Apr 17 20:26:22.136109 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:22.136078 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:26:22.136109 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:22.136114 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:26:22.136554 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:22.136540 2579 scope.go:117] "RemoveContainer" 
containerID="bde437c9b774242f8e1bc721022f77907abbae62a0e35e235c1238d10eb6b820" Apr 17 20:26:22.136738 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:22.136722 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:26:25.635941 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:25.635901 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:26:25.635941 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:25.635948 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:26:25.636460 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:25.636444 2579 scope.go:117] "RemoveContainer" containerID="aece50b96a00dfde88379fba12553e33f514956d879de3d69f9daa2e896a5a37" Apr 17 20:26:25.636682 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:25.636663 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:26:34.873933 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:34.873898 2579 scope.go:117] "RemoveContainer" containerID="bde437c9b774242f8e1bc721022f77907abbae62a0e35e235c1238d10eb6b820" Apr 17 20:26:34.874400 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:34.874071 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" 
with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:26:39.874557 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:39.874519 2579 scope.go:117] "RemoveContainer" containerID="aece50b96a00dfde88379fba12553e33f514956d879de3d69f9daa2e896a5a37" Apr 17 20:26:39.874972 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:39.874695 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:26:47.873878 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:47.873846 2579 scope.go:117] "RemoveContainer" containerID="bde437c9b774242f8e1bc721022f77907abbae62a0e35e235c1238d10eb6b820" Apr 17 20:26:48.596760 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:48.596726 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/3.log" Apr 17 20:26:48.597203 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:48.597185 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/2.log" Apr 17 20:26:48.597533 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:48.597506 2579 generic.go:358] "Generic (PLEG): container finished" podID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" containerID="f57d662be25b024ed64236a1cd002421c87131e76b1335a27fc8e99150008101" exitCode=2 Apr 17 20:26:48.597646 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:48.597589 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" event={"ID":"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2","Type":"ContainerDied","Data":"f57d662be25b024ed64236a1cd002421c87131e76b1335a27fc8e99150008101"} Apr 17 20:26:48.597646 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:48.597631 2579 scope.go:117] "RemoveContainer" containerID="bde437c9b774242f8e1bc721022f77907abbae62a0e35e235c1238d10eb6b820" Apr 17 20:26:48.598061 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:48.598042 2579 scope.go:117] "RemoveContainer" containerID="f57d662be25b024ed64236a1cd002421c87131e76b1335a27fc8e99150008101" Apr 17 20:26:48.598283 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:48.598263 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:26:49.602826 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:49.602795 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/3.log" Apr 17 20:26:52.136309 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:52.136278 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:26:52.136309 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:52.136317 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:26:52.136740 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:52.136732 2579 scope.go:117] "RemoveContainer" containerID="f57d662be25b024ed64236a1cd002421c87131e76b1335a27fc8e99150008101" Apr 17 
20:26:52.136990 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:52.136971 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:26:52.873880 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:52.873847 2579 scope.go:117] "RemoveContainer" containerID="aece50b96a00dfde88379fba12553e33f514956d879de3d69f9daa2e896a5a37" Apr 17 20:26:53.619598 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:53.619518 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/3.log" Apr 17 20:26:53.620038 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:53.619927 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/2.log" Apr 17 20:26:53.620234 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:53.620213 2579 generic.go:358] "Generic (PLEG): container finished" podID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" containerID="69d73e9b879a51487c1d393ac99f42b004ef03f9cdbb42119be2c562efe5f9ce" exitCode=2 Apr 17 20:26:53.620308 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:53.620291 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" event={"ID":"1e90011d-ca6d-4adf-a79f-d1f39ea0069b","Type":"ContainerDied","Data":"69d73e9b879a51487c1d393ac99f42b004ef03f9cdbb42119be2c562efe5f9ce"} Apr 17 20:26:53.620346 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:53.620338 2579 scope.go:117] "RemoveContainer" containerID="aece50b96a00dfde88379fba12553e33f514956d879de3d69f9daa2e896a5a37" Apr 17 20:26:53.620720 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:26:53.620705 2579 scope.go:117] "RemoveContainer" containerID="69d73e9b879a51487c1d393ac99f42b004ef03f9cdbb42119be2c562efe5f9ce" Apr 17 20:26:53.620971 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:53.620943 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:26:54.625206 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:54.625175 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/3.log" Apr 17 20:26:55.635632 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:55.635591 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:26:55.635632 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:55.635629 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:26:55.636123 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:26:55.636105 2579 scope.go:117] "RemoveContainer" containerID="69d73e9b879a51487c1d393ac99f42b004ef03f9cdbb42119be2c562efe5f9ce" Apr 17 20:26:55.636311 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:26:55.636291 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:27:04.873880 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:27:04.873839 2579 scope.go:117] "RemoveContainer" containerID="f57d662be25b024ed64236a1cd002421c87131e76b1335a27fc8e99150008101" Apr 17 20:27:04.874370 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:27:04.874142 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:27:08.873852 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:08.873815 2579 scope.go:117] "RemoveContainer" containerID="69d73e9b879a51487c1d393ac99f42b004ef03f9cdbb42119be2c562efe5f9ce" Apr 17 20:27:08.874282 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:27:08.873980 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:27:18.874655 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:18.874579 2579 scope.go:117] "RemoveContainer" containerID="f57d662be25b024ed64236a1cd002421c87131e76b1335a27fc8e99150008101" Apr 17 20:27:18.875154 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:27:18.874813 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:27:20.876783 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:20.876739 2579 
scope.go:117] "RemoveContainer" containerID="69d73e9b879a51487c1d393ac99f42b004ef03f9cdbb42119be2c562efe5f9ce" Apr 17 20:27:20.877187 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:27:20.876963 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:27:30.877510 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:30.877476 2579 scope.go:117] "RemoveContainer" containerID="f57d662be25b024ed64236a1cd002421c87131e76b1335a27fc8e99150008101" Apr 17 20:27:31.774498 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:31.774469 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/4.log" Apr 17 20:27:31.774883 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:31.774868 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/3.log" Apr 17 20:27:31.775197 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:31.775175 2579 generic.go:358] "Generic (PLEG): container finished" podID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" containerID="e93aa7f62f6959d49f116867765df21c189cb0d044caad3285943d0491c0eef9" exitCode=2 Apr 17 20:27:31.775283 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:31.775219 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" event={"ID":"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2","Type":"ContainerDied","Data":"e93aa7f62f6959d49f116867765df21c189cb0d044caad3285943d0491c0eef9"} Apr 17 20:27:31.775283 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:31.775258 2579 scope.go:117] "RemoveContainer" 
containerID="f57d662be25b024ed64236a1cd002421c87131e76b1335a27fc8e99150008101" Apr 17 20:27:31.775788 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:31.775768 2579 scope.go:117] "RemoveContainer" containerID="e93aa7f62f6959d49f116867765df21c189cb0d044caad3285943d0491c0eef9" Apr 17 20:27:31.776024 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:27:31.776005 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:27:32.136188 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:32.136156 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:27:32.136586 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:32.136198 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:27:32.780863 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:32.780832 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/4.log" Apr 17 20:27:32.781614 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:32.781590 2579 scope.go:117] "RemoveContainer" containerID="e93aa7f62f6959d49f116867765df21c189cb0d044caad3285943d0491c0eef9" Apr 17 20:27:32.781832 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:27:32.781813 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" 
pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:27:35.873940 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:35.873906 2579 scope.go:117] "RemoveContainer" containerID="69d73e9b879a51487c1d393ac99f42b004ef03f9cdbb42119be2c562efe5f9ce" Apr 17 20:27:36.797704 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:36.797676 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/4.log" Apr 17 20:27:36.798097 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:36.798082 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/3.log" Apr 17 20:27:36.798419 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:36.798399 2579 generic.go:358] "Generic (PLEG): container finished" podID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" containerID="d2ad675b8a5c4f3ff0028b44860031ddb455e7de42b5c1216465b56e6dc6f0ec" exitCode=2 Apr 17 20:27:36.798509 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:36.798486 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" event={"ID":"1e90011d-ca6d-4adf-a79f-d1f39ea0069b","Type":"ContainerDied","Data":"d2ad675b8a5c4f3ff0028b44860031ddb455e7de42b5c1216465b56e6dc6f0ec"} Apr 17 20:27:36.798570 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:36.798539 2579 scope.go:117] "RemoveContainer" containerID="69d73e9b879a51487c1d393ac99f42b004ef03f9cdbb42119be2c562efe5f9ce" Apr 17 20:27:36.799006 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:36.798992 2579 scope.go:117] "RemoveContainer" containerID="d2ad675b8a5c4f3ff0028b44860031ddb455e7de42b5c1216465b56e6dc6f0ec" Apr 17 20:27:36.799243 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:27:36.799220 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with 
CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:27:37.804151 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:37.804124 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/4.log" Apr 17 20:27:45.634949 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:45.634905 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:27:45.634949 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:45.634956 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:27:45.635399 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:45.635385 2579 scope.go:117] "RemoveContainer" containerID="d2ad675b8a5c4f3ff0028b44860031ddb455e7de42b5c1216465b56e6dc6f0ec" Apr 17 20:27:45.635607 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:27:45.635589 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:27:45.874386 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:45.874348 2579 scope.go:117] "RemoveContainer" containerID="e93aa7f62f6959d49f116867765df21c189cb0d044caad3285943d0491c0eef9" Apr 17 20:27:45.874588 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:27:45.874561 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: 
\"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:27:57.873998 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:57.873963 2579 scope.go:117] "RemoveContainer" containerID="d2ad675b8a5c4f3ff0028b44860031ddb455e7de42b5c1216465b56e6dc6f0ec" Apr 17 20:27:57.874427 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:27:57.874135 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:27:58.873888 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:27:58.873849 2579 scope.go:117] "RemoveContainer" containerID="e93aa7f62f6959d49f116867765df21c189cb0d044caad3285943d0491c0eef9" Apr 17 20:27:58.874115 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:27:58.874090 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:28:11.873852 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:28:11.873811 2579 scope.go:117] "RemoveContainer" containerID="d2ad675b8a5c4f3ff0028b44860031ddb455e7de42b5c1216465b56e6dc6f0ec" Apr 17 20:28:11.874402 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:28:11.873978 2579 scope.go:117] "RemoveContainer" containerID="e93aa7f62f6959d49f116867765df21c189cb0d044caad3285943d0491c0eef9" Apr 17 20:28:11.874402 ip-10-0-139-2 
kubenswrapper[2579]: E0417 20:28:11.874053 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:28:11.874402 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:28:11.874128 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:28:24.874298 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:28:24.874257 2579 scope.go:117] "RemoveContainer" containerID="e93aa7f62f6959d49f116867765df21c189cb0d044caad3285943d0491c0eef9" Apr 17 20:28:24.874911 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:28:24.874443 2579 scope.go:117] "RemoveContainer" containerID="d2ad675b8a5c4f3ff0028b44860031ddb455e7de42b5c1216465b56e6dc6f0ec" Apr 17 20:28:24.874911 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:28:24.874477 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:28:24.874911 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:28:24.874608 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main 
pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:28:35.873864 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:28:35.873830 2579 scope.go:117] "RemoveContainer" containerID="d2ad675b8a5c4f3ff0028b44860031ddb455e7de42b5c1216465b56e6dc6f0ec" Apr 17 20:28:35.874289 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:28:35.874044 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:28:37.874028 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:28:37.873990 2579 scope.go:117] "RemoveContainer" containerID="e93aa7f62f6959d49f116867765df21c189cb0d044caad3285943d0491c0eef9" Apr 17 20:28:37.874427 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:28:37.874253 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:28:48.874041 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:28:48.873944 2579 scope.go:117] "RemoveContainer" containerID="e93aa7f62f6959d49f116867765df21c189cb0d044caad3285943d0491c0eef9" Apr 17 20:28:48.874614 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:28:48.874166 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main 
pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:28:50.877439 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:28:50.877407 2579 scope.go:117] "RemoveContainer" containerID="d2ad675b8a5c4f3ff0028b44860031ddb455e7de42b5c1216465b56e6dc6f0ec" Apr 17 20:28:50.877944 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:28:50.877578 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:29:02.874217 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:02.874180 2579 scope.go:117] "RemoveContainer" containerID="e93aa7f62f6959d49f116867765df21c189cb0d044caad3285943d0491c0eef9" Apr 17 20:29:03.176294 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:03.176271 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/4.log" Apr 17 20:29:03.176932 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:03.176614 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" event={"ID":"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2","Type":"ContainerStarted","Data":"2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b"} Apr 17 20:29:03.176932 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:03.176855 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:29:03.197118 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:03.197068 2579 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podStartSLOduration=7.479917877 podStartE2EDuration="3m12.197052128s" podCreationTimestamp="2026-04-17 20:25:51 +0000 UTC" firstStartedPulling="2026-04-17 20:25:58.371389597 +0000 UTC m=+628.036315267" lastFinishedPulling="2026-04-17 20:29:03.088523848 +0000 UTC m=+812.753449518" observedRunningTime="2026-04-17 20:29:03.193780425 +0000 UTC m=+812.858706106" watchObservedRunningTime="2026-04-17 20:29:03.197052128 +0000 UTC m=+812.861977853" Apr 17 20:29:04.181994 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:04.181964 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/5.log" Apr 17 20:29:04.182439 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:04.182313 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/4.log" Apr 17 20:29:04.182592 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:04.182570 2579 generic.go:358] "Generic (PLEG): container finished" podID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" exitCode=2 Apr 17 20:29:04.182687 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:04.182663 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" event={"ID":"302d51b4-fba4-4d9f-b1a0-c022c50b4dd2","Type":"ContainerDied","Data":"2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b"} Apr 17 20:29:04.182780 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:04.182718 2579 scope.go:117] "RemoveContainer" containerID="e93aa7f62f6959d49f116867765df21c189cb0d044caad3285943d0491c0eef9" Apr 17 20:29:04.183036 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:04.183021 2579 scope.go:117] "RemoveContainer" 
containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" Apr 17 20:29:04.183230 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:04.183207 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:29:04.873997 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:04.873960 2579 scope.go:117] "RemoveContainer" containerID="d2ad675b8a5c4f3ff0028b44860031ddb455e7de42b5c1216465b56e6dc6f0ec" Apr 17 20:29:05.187664 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:05.187638 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/5.log" Apr 17 20:29:05.188438 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:05.188414 2579 scope.go:117] "RemoveContainer" containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" Apr 17 20:29:05.188661 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:05.188639 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:29:05.189322 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:05.189306 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/5.log" Apr 17 20:29:05.189662 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:05.189647 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/4.log" Apr 17 20:29:05.189960 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:05.189941 2579 generic.go:358] "Generic (PLEG): container finished" podID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" containerID="0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b" exitCode=2 Apr 17 20:29:05.190037 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:05.190010 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" event={"ID":"1e90011d-ca6d-4adf-a79f-d1f39ea0069b","Type":"ContainerDied","Data":"0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b"} Apr 17 20:29:05.190097 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:05.190057 2579 scope.go:117] "RemoveContainer" containerID="d2ad675b8a5c4f3ff0028b44860031ddb455e7de42b5c1216465b56e6dc6f0ec" Apr 17 20:29:05.190463 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:05.190444 2579 scope.go:117] "RemoveContainer" containerID="0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b" Apr 17 20:29:05.190692 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:05.190673 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:29:05.635566 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:05.635530 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:29:05.635566 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:05.635571 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" Apr 17 20:29:06.194917 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:06.194887 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/5.log" Apr 17 20:29:06.195640 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:06.195617 2579 scope.go:117] "RemoveContainer" containerID="0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b" Apr 17 20:29:06.195855 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:06.195834 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:29:12.135881 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:12.135841 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" Apr 17 20:29:12.136406 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:12.136391 2579 scope.go:117] "RemoveContainer" containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" Apr 17 20:29:12.136592 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:12.136576 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:29:16.874261 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:16.874226 2579 scope.go:117] "RemoveContainer" 
containerID="0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b" Apr 17 20:29:16.874642 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:16.874407 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:29:23.874154 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:23.874115 2579 scope.go:117] "RemoveContainer" containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" Apr 17 20:29:23.874630 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:23.874313 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:29:28.874334 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:28.874298 2579 scope.go:117] "RemoveContainer" containerID="0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b" Apr 17 20:29:28.874770 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:28.874563 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:29:37.874259 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:37.874226 2579 scope.go:117] "RemoveContainer" 
containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" Apr 17 20:29:37.874693 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:37.874428 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:29:41.873960 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:41.873926 2579 scope.go:117] "RemoveContainer" containerID="0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b" Apr 17 20:29:41.874351 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:41.874110 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:29:48.874132 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:48.874100 2579 scope.go:117] "RemoveContainer" containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" Apr 17 20:29:48.874515 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:48.874291 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:29:55.873728 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:55.873695 2579 scope.go:117] "RemoveContainer" 
containerID="0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b" Apr 17 20:29:55.874225 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:55.873906 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:29:59.874340 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:29:59.874309 2579 scope.go:117] "RemoveContainer" containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" Apr 17 20:29:59.874777 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:29:59.874512 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:30:07.873663 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:07.873632 2579 scope.go:117] "RemoveContainer" containerID="0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b" Apr 17 20:30:07.874092 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:30:07.873863 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:30:11.873516 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:11.873470 2579 scope.go:117] "RemoveContainer" 
containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" Apr 17 20:30:11.874039 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:30:11.873662 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:30:18.873967 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:18.873884 2579 scope.go:117] "RemoveContainer" containerID="0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b" Apr 17 20:30:18.874337 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:30:18.874057 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:30:23.874504 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:23.874470 2579 scope.go:117] "RemoveContainer" containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" Apr 17 20:30:23.874945 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:30:23.874671 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:30:30.817361 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:30.817332 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/5.log" Apr 17 20:30:30.818079 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:30.818064 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/5.log" Apr 17 20:30:30.822669 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:30.822644 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/5.log" Apr 17 20:30:30.823252 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:30.823234 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/5.log" Apr 17 20:30:30.837344 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:30.837318 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/1.log" Apr 17 20:30:30.841644 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:30.841620 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/1.log" Apr 17 20:30:31.012855 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:31.012827 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/5.log" Apr 17 20:30:31.116948 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:31.116833 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/5.log" Apr 17 20:30:31.221638 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:31.221604 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/storage-initializer/0.log" Apr 17 20:30:31.874416 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:31.874385 2579 scope.go:117] "RemoveContainer" containerID="0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b" Apr 17 20:30:31.874817 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:30:31.874591 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:30:32.081031 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:32.080992 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-757bffb484-rcrqm_d6b5c13a-d9e1-405b-a334-54eba432d195/authorino/0.log" Apr 17 20:30:35.939265 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:35.939234 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-86c984498-ggc2l_0c11c3fa-c709-4adc-a21c-1a35e0443193/maas-api/0.log" Apr 17 20:30:36.391465 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:36.391428 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-799c8bc7d9-2pwvr_c4ee013b-81d0-4948-b03e-47eb5ced8bfc/manager/0.log" Apr 17 20:30:36.722954 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:36.722879 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-mtgz9_e698d0e1-d08e-4291-8466-488fbb4ef89f/postgres/0.log" Apr 17 20:30:37.455870 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.455839 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww_76972add-7656-4d60-85e0-b01a86f15425/extract/0.log" Apr 17 20:30:37.460845 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.460820 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww_76972add-7656-4d60-85e0-b01a86f15425/util/0.log" Apr 17 20:30:37.466135 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.466113 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww_76972add-7656-4d60-85e0-b01a86f15425/pull/0.log" Apr 17 20:30:37.569284 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.569254 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk_9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a/util/0.log" Apr 17 20:30:37.575118 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.575092 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk_9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a/pull/0.log" Apr 17 20:30:37.580357 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.580330 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk_9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a/extract/0.log" Apr 17 20:30:37.687522 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.687495 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r_fb665cb0-45bb-47ff-91cc-a45213b8e5b3/util/0.log" Apr 17 20:30:37.693579 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.693555 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r_fb665cb0-45bb-47ff-91cc-a45213b8e5b3/pull/0.log" Apr 17 20:30:37.699175 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.699147 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r_fb665cb0-45bb-47ff-91cc-a45213b8e5b3/extract/0.log" Apr 17 20:30:37.799788 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.799682 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7_7d748162-4119-4a69-95dd-02301e4f557d/pull/0.log" Apr 17 20:30:37.804928 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.804896 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7_7d748162-4119-4a69-95dd-02301e4f557d/extract/0.log" Apr 17 20:30:37.813972 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.813948 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7_7d748162-4119-4a69-95dd-02301e4f557d/util/0.log" Apr 17 20:30:37.920052 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:37.920022 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-757bffb484-rcrqm_d6b5c13a-d9e1-405b-a334-54eba432d195/authorino/0.log" Apr 17 20:30:38.130292 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:38.130267 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-v5hqn_7a177b04-5a6e-419d-8e30-76109480eb46/manager/0.log" Apr 17 20:30:38.345278 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:38.345234 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-stdpt_7764d55c-5e37-47c3-9e21-f52d5eb3e0b3/registry-server/0.log" Apr 17 20:30:38.661195 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:38.661162 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-2nqtj_7ebba3e7-ce6a-4679-a085-b93859f475f8/manager/0.log" Apr 17 20:30:38.874441 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:38.874401 2579 scope.go:117] "RemoveContainer" containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" Apr 17 20:30:38.874687 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:30:38.874663 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:30:38.979565 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:38.979481 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj_6c8d819c-bde4-4f6a-8afd-61f822c02c75/istio-proxy/0.log" Apr 17 20:30:39.395304 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:39.395274 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-66qbf_e146b460-6bc7-439b-b7f1-513b847a73d3/istio-proxy/0.log" Apr 17 20:30:39.934101 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:39.934072 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/storage-initializer/0.log" Apr 17 20:30:39.939955 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:39.939934 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-954qg_1e90011d-ca6d-4adf-a79f-d1f39ea0069b/main/5.log" Apr 17 20:30:40.267868 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:40.267777 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/storage-initializer/0.log" Apr 17 20:30:40.276441 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:40.276397 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_302d51b4-fba4-4d9f-b1a0-c022c50b4dd2/main/5.log" Apr 17 20:30:44.455785 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.455734 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-klrcl/must-gather-xf2gh"] Apr 17 20:30:44.459483 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.459466 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-klrcl/must-gather-xf2gh" Apr 17 20:30:44.462016 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.461993 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-klrcl\"/\"kube-root-ca.crt\"" Apr 17 20:30:44.462220 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.462005 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-klrcl\"/\"openshift-service-ca.crt\"" Apr 17 20:30:44.462856 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.462838 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-klrcl\"/\"default-dockercfg-b2tbp\"" Apr 17 20:30:44.475392 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.470645 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-klrcl/must-gather-xf2gh"] Apr 17 20:30:44.598800 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.598758 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/89af3ec1-18d7-4f5d-bd4b-0496fab135b7-must-gather-output\") pod \"must-gather-xf2gh\" (UID: \"89af3ec1-18d7-4f5d-bd4b-0496fab135b7\") " pod="openshift-must-gather-klrcl/must-gather-xf2gh" Apr 17 20:30:44.598981 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.598820 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brg2w\" (UniqueName: \"kubernetes.io/projected/89af3ec1-18d7-4f5d-bd4b-0496fab135b7-kube-api-access-brg2w\") pod \"must-gather-xf2gh\" (UID: \"89af3ec1-18d7-4f5d-bd4b-0496fab135b7\") " pod="openshift-must-gather-klrcl/must-gather-xf2gh" Apr 17 20:30:44.700210 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.700165 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/89af3ec1-18d7-4f5d-bd4b-0496fab135b7-must-gather-output\") pod \"must-gather-xf2gh\" (UID: \"89af3ec1-18d7-4f5d-bd4b-0496fab135b7\") " pod="openshift-must-gather-klrcl/must-gather-xf2gh" Apr 17 20:30:44.700417 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.700229 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brg2w\" (UniqueName: \"kubernetes.io/projected/89af3ec1-18d7-4f5d-bd4b-0496fab135b7-kube-api-access-brg2w\") pod \"must-gather-xf2gh\" (UID: \"89af3ec1-18d7-4f5d-bd4b-0496fab135b7\") " pod="openshift-must-gather-klrcl/must-gather-xf2gh" Apr 17 20:30:44.700645 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.700615 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/89af3ec1-18d7-4f5d-bd4b-0496fab135b7-must-gather-output\") pod \"must-gather-xf2gh\" (UID: \"89af3ec1-18d7-4f5d-bd4b-0496fab135b7\") " pod="openshift-must-gather-klrcl/must-gather-xf2gh" Apr 17 
20:30:44.709020 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.708952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brg2w\" (UniqueName: \"kubernetes.io/projected/89af3ec1-18d7-4f5d-bd4b-0496fab135b7-kube-api-access-brg2w\") pod \"must-gather-xf2gh\" (UID: \"89af3ec1-18d7-4f5d-bd4b-0496fab135b7\") " pod="openshift-must-gather-klrcl/must-gather-xf2gh" Apr 17 20:30:44.778482 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.778437 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-klrcl/must-gather-xf2gh" Apr 17 20:30:44.909289 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:44.909261 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-klrcl/must-gather-xf2gh"] Apr 17 20:30:44.911120 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:30:44.911090 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89af3ec1_18d7_4f5d_bd4b_0496fab135b7.slice/crio-3e9fbd5ceb8aabdc5a357c6ef1acd480482f7588af94d44e5b899af555ffab13 WatchSource:0}: Error finding container 3e9fbd5ceb8aabdc5a357c6ef1acd480482f7588af94d44e5b899af555ffab13: Status 404 returned error can't find the container with id 3e9fbd5ceb8aabdc5a357c6ef1acd480482f7588af94d44e5b899af555ffab13 Apr 17 20:30:45.603074 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:45.603031 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klrcl/must-gather-xf2gh" event={"ID":"89af3ec1-18d7-4f5d-bd4b-0496fab135b7","Type":"ContainerStarted","Data":"3e9fbd5ceb8aabdc5a357c6ef1acd480482f7588af94d44e5b899af555ffab13"} Apr 17 20:30:45.861695 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:45.861671 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:30:45.875047 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:45.874134 2579 scope.go:117] "RemoveContainer" 
containerID="0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b" Apr 17 20:30:45.875047 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:30:45.874388 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:30:46.609501 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:46.609463 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klrcl/must-gather-xf2gh" event={"ID":"89af3ec1-18d7-4f5d-bd4b-0496fab135b7","Type":"ContainerStarted","Data":"6da0286513fbe21560121b87530261b9782eb90e41bbb28718595bd4ce7ff636"} Apr 17 20:30:46.609501 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:46.609510 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klrcl/must-gather-xf2gh" event={"ID":"89af3ec1-18d7-4f5d-bd4b-0496fab135b7","Type":"ContainerStarted","Data":"3a0ab860aa9d81eec414f89c50652d106a5cb3cae2bdbf4985f5de19d5be472b"} Apr 17 20:30:46.625182 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:46.625132 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-klrcl/must-gather-xf2gh" podStartSLOduration=1.8008692769999999 podStartE2EDuration="2.625117758s" podCreationTimestamp="2026-04-17 20:30:44 +0000 UTC" firstStartedPulling="2026-04-17 20:30:44.913247578 +0000 UTC m=+914.578173250" lastFinishedPulling="2026-04-17 20:30:45.73749606 +0000 UTC m=+915.402421731" observedRunningTime="2026-04-17 20:30:46.623201141 +0000 UTC m=+916.288126831" watchObservedRunningTime="2026-04-17 20:30:46.625117758 +0000 UTC m=+916.290043448" Apr 17 20:30:47.392047 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:47.392012 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-smnrp_52c55f78-79f4-41d2-8ed3-1f214a05f8ae/global-pull-secret-syncer/0.log" Apr 17 20:30:47.435523 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:47.435489 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mcn8b_955d42b9-2b82-4faa-aa56-05c806e38889/konnectivity-agent/0.log" Apr 17 20:30:47.544888 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:47.544856 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-2.ec2.internal_2a95c6233e62f070e1ff4cac4a1fc713/haproxy/0.log" Apr 17 20:30:51.573171 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.573124 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww_76972add-7656-4d60-85e0-b01a86f15425/extract/0.log" Apr 17 20:30:51.615848 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.615813 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww_76972add-7656-4d60-85e0-b01a86f15425/util/0.log" Apr 17 20:30:51.655991 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.655290 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759rn4ww_76972add-7656-4d60-85e0-b01a86f15425/pull/0.log" Apr 17 20:30:51.685305 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.685128 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk_9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a/extract/0.log" Apr 17 20:30:51.707930 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.707897 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk_9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a/util/0.log" Apr 17 20:30:51.730063 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.730037 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0c22xk_9d7529f4-cf46-4ef3-a0fb-2c7c129ac06a/pull/0.log" Apr 17 20:30:51.764344 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.764308 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r_fb665cb0-45bb-47ff-91cc-a45213b8e5b3/extract/0.log" Apr 17 20:30:51.789629 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.789592 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r_fb665cb0-45bb-47ff-91cc-a45213b8e5b3/util/0.log" Apr 17 20:30:51.810574 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.810533 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739b55r_fb665cb0-45bb-47ff-91cc-a45213b8e5b3/pull/0.log" Apr 17 20:30:51.837728 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.837652 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7_7d748162-4119-4a69-95dd-02301e4f557d/extract/0.log" Apr 17 20:30:51.858884 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.858850 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7_7d748162-4119-4a69-95dd-02301e4f557d/util/0.log" Apr 17 20:30:51.873717 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.873688 2579 scope.go:117] "RemoveContainer" containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" Apr 17 
20:30:51.874260 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:30:51.874234 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:30:51.880454 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.880398 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1vqgw7_7d748162-4119-4a69-95dd-02301e4f557d/pull/0.log" Apr 17 20:30:51.914399 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.914372 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-757bffb484-rcrqm_d6b5c13a-d9e1-405b-a334-54eba432d195/authorino/0.log" Apr 17 20:30:51.965849 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:51.965797 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-v5hqn_7a177b04-5a6e-419d-8e30-76109480eb46/manager/0.log" Apr 17 20:30:52.026944 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:52.026877 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-stdpt_7764d55c-5e37-47c3-9e21-f52d5eb3e0b3/registry-server/0.log" Apr 17 20:30:52.118608 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:52.118517 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-2nqtj_7ebba3e7-ce6a-4679-a085-b93859f475f8/manager/0.log" Apr 17 20:30:53.785542 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:53.785512 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-d66dp_2c9bec13-455f-46f1-b0d0-62183c8c00c7/cluster-monitoring-operator/0.log" Apr 17 20:30:53.820088 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:53.820000 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2qf44_a5a95a60-e2cc-428e-b995-a69225112a29/kube-state-metrics/0.log" Apr 17 20:30:53.840968 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:53.840936 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2qf44_a5a95a60-e2cc-428e-b995-a69225112a29/kube-rbac-proxy-main/0.log" Apr 17 20:30:53.863876 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:53.863849 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2qf44_a5a95a60-e2cc-428e-b995-a69225112a29/kube-rbac-proxy-self/0.log" Apr 17 20:30:54.034314 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:54.034280 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9zsg7_b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b/node-exporter/0.log" Apr 17 20:30:54.058895 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:54.058812 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9zsg7_b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b/kube-rbac-proxy/0.log" Apr 17 20:30:54.083915 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:54.083879 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9zsg7_b4e5556b-eed0-4b0d-bdad-d95dc24a9f4b/init-textfile/0.log" Apr 17 20:30:54.415642 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:54.415600 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-s82nl_f033e518-f79c-4c74-9235-1a284adf65c0/prometheus-operator/0.log" Apr 17 20:30:54.434143 ip-10-0-139-2 
kubenswrapper[2579]: I0417 20:30:54.434112 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-s82nl_f033e518-f79c-4c74-9235-1a284adf65c0/kube-rbac-proxy/0.log" Apr 17 20:30:54.457205 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:54.457166 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-qghqc_7e5f305d-f8a2-4e23-8714-04f855b755fb/prometheus-operator-admission-webhook/0.log" Apr 17 20:30:54.553193 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:54.553160 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c55f7d6c7-bb687_ae81404b-bc25-469b-be3f-d5f02eb9709a/thanos-query/0.log" Apr 17 20:30:54.573703 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:54.573671 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c55f7d6c7-bb687_ae81404b-bc25-469b-be3f-d5f02eb9709a/kube-rbac-proxy-web/0.log" Apr 17 20:30:54.605804 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:54.605769 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c55f7d6c7-bb687_ae81404b-bc25-469b-be3f-d5f02eb9709a/kube-rbac-proxy/0.log" Apr 17 20:30:54.625537 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:54.625380 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c55f7d6c7-bb687_ae81404b-bc25-469b-be3f-d5f02eb9709a/prom-label-proxy/0.log" Apr 17 20:30:54.648460 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:54.648424 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c55f7d6c7-bb687_ae81404b-bc25-469b-be3f-d5f02eb9709a/kube-rbac-proxy-rules/0.log" Apr 17 20:30:54.668360 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:54.668288 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-5c55f7d6c7-bb687_ae81404b-bc25-469b-be3f-d5f02eb9709a/kube-rbac-proxy-metrics/0.log" Apr 17 20:30:56.051871 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.051833 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv"] Apr 17 20:30:56.059595 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.059564 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.062867 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.062838 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv"] Apr 17 20:30:56.122716 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.122675 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8737027b-f8ec-480f-9805-994b6bdaaee8-podres\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.123028 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.123006 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrvsx\" (UniqueName: \"kubernetes.io/projected/8737027b-f8ec-480f-9805-994b6bdaaee8-kube-api-access-lrvsx\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.123174 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.123156 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8737027b-f8ec-480f-9805-994b6bdaaee8-sys\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: 
\"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.123251 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.123206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8737027b-f8ec-480f-9805-994b6bdaaee8-proc\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.123397 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.123370 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8737027b-f8ec-480f-9805-994b6bdaaee8-lib-modules\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.224931 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.224878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8737027b-f8ec-480f-9805-994b6bdaaee8-proc\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.225179 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.225033 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8737027b-f8ec-480f-9805-994b6bdaaee8-proc\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.225336 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.225296 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/8737027b-f8ec-480f-9805-994b6bdaaee8-lib-modules\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.225533 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.225514 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8737027b-f8ec-480f-9805-994b6bdaaee8-lib-modules\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.225604 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.225591 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8737027b-f8ec-480f-9805-994b6bdaaee8-podres\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.225681 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.225659 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrvsx\" (UniqueName: \"kubernetes.io/projected/8737027b-f8ec-480f-9805-994b6bdaaee8-kube-api-access-lrvsx\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.225896 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.225687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8737027b-f8ec-480f-9805-994b6bdaaee8-sys\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.226008 ip-10-0-139-2 kubenswrapper[2579]: I0417 
20:30:56.225899 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8737027b-f8ec-480f-9805-994b6bdaaee8-sys\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.226008 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.225990 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8737027b-f8ec-480f-9805-994b6bdaaee8-podres\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.234000 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.233966 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrvsx\" (UniqueName: \"kubernetes.io/projected/8737027b-f8ec-480f-9805-994b6bdaaee8-kube-api-access-lrvsx\") pod \"perf-node-gather-daemonset-mhzxv\" (UID: \"8737027b-f8ec-480f-9805-994b6bdaaee8\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.259173 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.259135 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/1.log" Apr 17 20:30:56.265443 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.265414 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8cv7h_d7fe830e-bccc-4359-9b7c-afa06ecd5668/console-operator/2.log" Apr 17 20:30:56.374850 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.374736 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.533810 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.533780 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv"] Apr 17 20:30:56.536575 ip-10-0-139-2 kubenswrapper[2579]: W0417 20:30:56.536529 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8737027b_f8ec_480f_9805_994b6bdaaee8.slice/crio-32b6a972a5ef1d5872794aaf5ee98f557907b22ef114a73d41b05193fad94dbb WatchSource:0}: Error finding container 32b6a972a5ef1d5872794aaf5ee98f557907b22ef114a73d41b05193fad94dbb: Status 404 returned error can't find the container with id 32b6a972a5ef1d5872794aaf5ee98f557907b22ef114a73d41b05193fad94dbb Apr 17 20:30:56.690872 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.690832 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" event={"ID":"8737027b-f8ec-480f-9805-994b6bdaaee8","Type":"ContainerStarted","Data":"5a33383ecd108fa14918f4a9921803fac684f4822641b33459d3f1945df0ac7b"} Apr 17 20:30:56.690872 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.690877 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" event={"ID":"8737027b-f8ec-480f-9805-994b6bdaaee8","Type":"ContainerStarted","Data":"32b6a972a5ef1d5872794aaf5ee98f557907b22ef114a73d41b05193fad94dbb"} Apr 17 20:30:56.691103 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.690935 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:30:56.707344 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.707274 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" 
podStartSLOduration=0.707253026 podStartE2EDuration="707.253026ms" podCreationTimestamp="2026-04-17 20:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:30:56.706636863 +0000 UTC m=+926.371562553" watchObservedRunningTime="2026-04-17 20:30:56.707253026 +0000 UTC m=+926.372178717" Apr 17 20:30:56.751702 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:56.751672 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64b7744d46-4q2t4_8bea4f82-eaa0-42f1-8856-45a27b083b22/console/0.log" Apr 17 20:30:57.310196 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:57.310167 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-9k5w8_1b176a9f-c210-49e2-9286-ec8df6440b2b/volume-data-source-validator/0.log" Apr 17 20:30:58.156383 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:58.156355 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vkvbf_a083116d-3c26-492c-b99a-c51bbaa51aa4/dns/0.log" Apr 17 20:30:58.176158 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:58.176129 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vkvbf_a083116d-3c26-492c-b99a-c51bbaa51aa4/kube-rbac-proxy/0.log" Apr 17 20:30:58.240757 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:58.240710 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w9tk6_804b57c1-49b2-4e56-8da1-70a591e070e2/dns-node-resolver/0.log" Apr 17 20:30:58.691219 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:58.691189 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6c8d5fbf5c-sfqff_0c1607fb-6a2c-4213-8de7-34c392a4fd1c/registry/0.log" Apr 17 20:30:58.694493 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:58.694467 2579 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-image-registry_image-registry-6c8d5fbf5c-sfqff_0c1607fb-6a2c-4213-8de7-34c392a4fd1c/registry/1.log" Apr 17 20:30:58.711919 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:58.711897 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-msb4t_963f9ba4-1fc0-4858-aa10-5a4f1aaf9c18/node-ca/0.log" Apr 17 20:30:59.506616 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:59.506581 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfgdlbj_6c8d819c-bde4-4f6a-8afd-61f822c02c75/istio-proxy/0.log" Apr 17 20:30:59.626680 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:59.626651 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-66qbf_e146b460-6bc7-439b-b7f1-513b847a73d3/istio-proxy/0.log" Apr 17 20:30:59.874201 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:30:59.874123 2579 scope.go:117] "RemoveContainer" containerID="0918edf5476e20f408c8bc9b2513674eac2630251f258b0f167ab11021fa586b" Apr 17 20:30:59.874558 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:30:59.874299 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-954qg_llm(1e90011d-ca6d-4adf-a79f-d1f39ea0069b)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-954qg" podUID="1e90011d-ca6d-4adf-a79f-d1f39ea0069b" Apr 17 20:31:00.180889 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:00.180857 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zvwgs_159a3a3e-608e-405f-ac09-ff7186a9c710/serve-healthcheck-canary/0.log" Apr 17 20:31:00.604573 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:00.604490 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-gw79k_ee130621-350a-49cf-905b-3a5917dcd327/insights-operator/1.log" Apr 17 20:31:00.605023 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:00.605003 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-gw79k_ee130621-350a-49cf-905b-3a5917dcd327/insights-operator/0.log" Apr 17 20:31:00.692047 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:00.692018 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6hz2t_5167bfd3-1c9b-4daa-a7ca-08927f909b5f/kube-rbac-proxy/0.log" Apr 17 20:31:00.712223 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:00.712196 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6hz2t_5167bfd3-1c9b-4daa-a7ca-08927f909b5f/exporter/0.log" Apr 17 20:31:00.734613 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:00.734565 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6hz2t_5167bfd3-1c9b-4daa-a7ca-08927f909b5f/extractor/0.log" Apr 17 20:31:02.614029 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:02.613998 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-86c984498-ggc2l_0c11c3fa-c709-4adc-a21c-1a35e0443193/maas-api/0.log" Apr 17 20:31:02.709770 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:02.709725 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-mhzxv" Apr 17 20:31:02.715019 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:02.714994 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-799c8bc7d9-2pwvr_c4ee013b-81d0-4948-b03e-47eb5ced8bfc/manager/0.log" Apr 17 20:31:02.808373 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:02.808338 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_postgres-868db5846d-mtgz9_e698d0e1-d08e-4291-8466-488fbb4ef89f/postgres/0.log" Apr 17 20:31:03.904978 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:03.904946 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-fd99964b4-6ddgt_ff4e6c44-621c-48b8-9698-958eb20c1f4b/manager/0.log" Apr 17 20:31:06.875068 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:06.875037 2579 scope.go:117] "RemoveContainer" containerID="2d9cfdddcaf572206e01513d56bae022f720b629cc83bf10f821a3349ea4413b" Apr 17 20:31:06.875581 ip-10-0-139-2 kubenswrapper[2579]: E0417 20:31:06.875391 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk_llm(302d51b4-fba4-4d9f-b1a0-c022c50b4dd2)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xntzk" podUID="302d51b4-fba4-4d9f-b1a0-c022c50b4dd2" Apr 17 20:31:08.670648 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:08.670612 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-tpfmp_1e417380-f1cf-4d7a-b044-4fb0022ce22c/kube-storage-version-migrator-operator/1.log" Apr 17 20:31:08.671527 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:08.671508 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-tpfmp_1e417380-f1cf-4d7a-b044-4fb0022ce22c/kube-storage-version-migrator-operator/0.log" Apr 17 20:31:09.811293 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:09.811266 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p25dr_b810562f-1e78-430b-bb52-4ddd48b17312/kube-multus-additional-cni-plugins/0.log" Apr 17 20:31:09.837960 
ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:09.837914 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p25dr_b810562f-1e78-430b-bb52-4ddd48b17312/egress-router-binary-copy/0.log" Apr 17 20:31:09.863142 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:09.863110 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p25dr_b810562f-1e78-430b-bb52-4ddd48b17312/cni-plugins/0.log" Apr 17 20:31:09.886455 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:09.886392 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p25dr_b810562f-1e78-430b-bb52-4ddd48b17312/bond-cni-plugin/0.log" Apr 17 20:31:09.909621 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:09.909592 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p25dr_b810562f-1e78-430b-bb52-4ddd48b17312/routeoverride-cni/0.log" Apr 17 20:31:09.933050 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:09.933021 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p25dr_b810562f-1e78-430b-bb52-4ddd48b17312/whereabouts-cni-bincopy/0.log" Apr 17 20:31:09.958068 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:09.958042 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p25dr_b810562f-1e78-430b-bb52-4ddd48b17312/whereabouts-cni/0.log" Apr 17 20:31:10.169521 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:10.169493 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r82dk_80b2534b-c049-4a65-8cdb-fc90c54d1a82/kube-multus/0.log" Apr 17 20:31:10.191167 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:10.191136 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-842wl_fcb80713-90b2-4ae8-95b5-a07c24ab45e2/network-metrics-daemon/0.log" Apr 17 20:31:10.210384 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:10.210359 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-842wl_fcb80713-90b2-4ae8-95b5-a07c24ab45e2/kube-rbac-proxy/0.log" Apr 17 20:31:11.109229 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:11.109196 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46w54_200427fe-0d95-4f14-9c75-fa998acab9e6/ovn-controller/0.log" Apr 17 20:31:11.131891 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:11.131854 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46w54_200427fe-0d95-4f14-9c75-fa998acab9e6/ovn-acl-logging/0.log" Apr 17 20:31:11.151781 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:11.151756 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46w54_200427fe-0d95-4f14-9c75-fa998acab9e6/kube-rbac-proxy-node/0.log" Apr 17 20:31:11.172824 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:11.172794 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46w54_200427fe-0d95-4f14-9c75-fa998acab9e6/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 20:31:11.194764 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:11.194724 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46w54_200427fe-0d95-4f14-9c75-fa998acab9e6/northd/0.log" Apr 17 20:31:11.214277 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:11.214246 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46w54_200427fe-0d95-4f14-9c75-fa998acab9e6/nbdb/0.log" Apr 17 20:31:11.233807 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:11.233777 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46w54_200427fe-0d95-4f14-9c75-fa998acab9e6/sbdb/0.log" Apr 17 20:31:11.352813 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:11.352781 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46w54_200427fe-0d95-4f14-9c75-fa998acab9e6/ovnkube-controller/0.log" Apr 17 20:31:12.909792 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:12.909734 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-tdbc5_ca2ad4aa-e46d-428a-8b10-9d150c00e450/check-endpoints/0.log" Apr 17 20:31:12.958137 ip-10-0-139-2 kubenswrapper[2579]: I0417 20:31:12.958101 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-q6spr_e114821c-4bf6-4656-8172-0f7ba8948fdc/network-check-target-container/0.log"