Apr 18 02:45:56.758846 ip-10-0-128-79 systemd[1]: Starting Kubernetes Kubelet...
Apr 18 02:45:57.138775 ip-10-0-128-79 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 18 02:45:57.138775 ip-10-0-128-79 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 18 02:45:57.138775 ip-10-0-128-79 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 18 02:45:57.138775 ip-10-0-128-79 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 18 02:45:57.138775 ip-10-0-128-79 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 18 02:45:57.141811 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.141723    2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 18 02:45:57.145951 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145932    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:45:57.145951 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145948    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 18 02:45:57.145951 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145955    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145959    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145964    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145968    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145973    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145977    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145981    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145985    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145989    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145993    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.145996    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146000    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146004    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146007    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146012    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146016    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146020    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146024    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146030    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146035    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 18 02:45:57.146128 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146040    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146044    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146049    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146053    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146058    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146063    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146067    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146072    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146076    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146090    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146095    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146099    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146105    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146109    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146114    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146118    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146122    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146126    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146131    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146135    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:45:57.146946 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146140    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146144    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146148    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146153    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146157    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146161    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146165    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146169    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146174    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146178    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146183    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146189    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146194    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146199    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146203    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146208    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146212    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146217    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146221    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 18 02:45:57.147603 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146225    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146229    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146234    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146239    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146243    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146247    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146252    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146257    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146262    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146266    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146270    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146274    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146278    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146284    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146288    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146292    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146297    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146301    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146305    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146309    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 18 02:45:57.148069 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146313    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146317    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146321    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146326    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.146330    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148605    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148616    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148621    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148626    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148630    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148635    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148639    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148644    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148649    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148653    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 18 02:45:57.148648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148657    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148661    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148665    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148670    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148674    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148678    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148682    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148686    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148690    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148694    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148699    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148703    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148707    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148711    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148715    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148719    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148723    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148727    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148733    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148737    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 18 02:45:57.149277 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148741    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148745    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148749    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148753    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148760    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148765    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148770    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148774    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148779    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148783    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148787    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148791    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148795    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148799    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148804    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148808    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148813    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148817    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148822    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 18 02:45:57.150135 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148825    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148830    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148834    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148838    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148842    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148847    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148851    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148855    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148859    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148864    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148868    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148871    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148876    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148879    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148884    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148888    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148892    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148896    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148900    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148904    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 18 02:45:57.150785 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148908    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148912    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148916    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148920    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148924    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148928    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148932    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148937    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148941    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148945    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148952    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148956    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148960    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148965    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148969    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148973    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.148979    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149078    2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149090    2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149103    2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149111    2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 18 02:45:57.151358 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149117    2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149123    2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149129    2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149136    2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149141    2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149146    2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149151    2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149157    2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149162    2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149167    2577 flags.go:64] FLAG: --cgroup-root=""
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149172    2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149177    2577 flags.go:64] FLAG: --client-ca-file=""
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149181    2577 flags.go:64] FLAG: --cloud-config=""
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149186    2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149191    2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149197    2577 flags.go:64] FLAG: --cluster-domain=""
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149201    2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149207    2577 flags.go:64] FLAG: --config-dir=""
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149213    2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149219    2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149225    2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149230    2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149237    2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149242    2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 18 02:45:57.151989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149247    2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149252    2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149257    2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149262    2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149267    2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149273    2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149278    2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149283    2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149287    2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149292    2577 flags.go:64] FLAG: --enable-server="true"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149300    2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149307    2577 flags.go:64] FLAG: --event-burst="100"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149313    2577 flags.go:64] FLAG: --event-qps="50"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149318    2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149324    2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149328    2577 flags.go:64] FLAG: --eviction-hard=""
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149334    2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149339    2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149344    2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149350    2577 flags.go:64] FLAG: --eviction-soft=""
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149355    2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149360    2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149365    2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149370    2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 18 02:45:57.152743
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149375 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 18 02:45:57.152743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149379 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149384 2577 flags.go:64] FLAG: --feature-gates="" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149391 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149396 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149401 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149406 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149412 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149416 2577 flags.go:64] FLAG: --help="false" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149421 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149427 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149432 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149437 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149442 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 18 
02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149448 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149453 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149457 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149462 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149469 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149474 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149479 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149484 2577 flags.go:64] FLAG: --kube-reserved="" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149488 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149493 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 18 02:45:57.153373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149498 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149503 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149507 2577 flags.go:64] FLAG: --lock-file="" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149512 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149517 2577 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149522 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149530 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149535 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149539 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149544 2577 flags.go:64] FLAG: --logging-format="text" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149566 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149572 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149577 2577 flags.go:64] FLAG: --manifest-url="" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149581 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149588 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149594 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149610 2577 flags.go:64] FLAG: --max-pods="110" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149616 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149621 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: 
I0418 02:45:57.149626 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149630 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149635 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149640 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149645 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149664 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 18 02:45:57.153998 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149669 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149678 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149684 2577 flags.go:64] FLAG: --pod-cidr="" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149688 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149698 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149702 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149707 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149712 2577 flags.go:64] FLAG: --port="10250" Apr 18 02:45:57.154672 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:45:57.149717 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149722 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0390431e435108f1d" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149727 2577 flags.go:64] FLAG: --qos-reserved="" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149733 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149738 2577 flags.go:64] FLAG: --register-node="true" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149743 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149747 2577 flags.go:64] FLAG: --register-with-taints="" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149753 2577 flags.go:64] FLAG: --registry-burst="10" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149758 2577 flags.go:64] FLAG: --registry-qps="5" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149763 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149768 2577 flags.go:64] FLAG: --reserved-memory="" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149774 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149779 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149784 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149788 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 18 02:45:57.154672 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:45:57.149793 2577 flags.go:64] FLAG: --runonce="false" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149798 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 18 02:45:57.154672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149803 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149809 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149814 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149819 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149824 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149829 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149833 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149838 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149845 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149851 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149855 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149860 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149865 2577 flags.go:64] FLAG: 
--system-cgroups="" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149869 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149878 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149882 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149887 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149893 2577 flags.go:64] FLAG: --tls-min-version="" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149898 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149903 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149907 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149912 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149917 2577 flags.go:64] FLAG: --v="2" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149923 2577 flags.go:64] FLAG: --version="false" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149930 2577 flags.go:64] FLAG: --vmodule="" Apr 18 02:45:57.155261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149936 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.149942 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150089 
2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150095 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150100 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150104 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150109 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150114 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150118 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150122 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150126 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150130 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150135 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150139 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150143 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 18 02:45:57.155918 
ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150150 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150154 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150158 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150163 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150167 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 18 02:45:57.155918 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150172 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150176 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150180 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150184 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150188 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150194 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150198 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150202 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 18 
02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150206 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150211 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150215 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150219 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150223 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150227 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150232 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150236 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150240 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150244 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150248 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150253 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 18 02:45:57.156388 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150258 2577 feature_gate.go:328] 
unrecognized feature gate: OVNObservability Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150262 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150266 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150270 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150274 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150278 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150282 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150286 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150290 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150294 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150301 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150305 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150309 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150313 2577 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150317 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150322 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150326 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150330 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150334 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150338 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 18 02:45:57.157027 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150345 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150351 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150356 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150361 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150365 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150369 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150373 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150378 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150382 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150386 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150390 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150394 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150399 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150404 2577 feature_gate.go:328] unrecognized feature gate: 
NetworkSegmentation Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150408 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150412 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150416 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150420 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150424 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150428 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 18 02:45:57.157825 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150433 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 18 02:45:57.158373 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150436 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 18 02:45:57.158373 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150445 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 18 02:45:57.158373 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150450 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:45:57.158373 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150455 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:45:57.158373 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150459 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:45:57.158373 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150464 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:45:57.158373 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.150468 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:45:57.158373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.151077 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 18 02:45:57.159425 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.159407 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 18 02:45:57.159467 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.159427 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 18 02:45:57.159499 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159473 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 18 02:45:57.159499 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159478 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 18 02:45:57.159499 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159482 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 18 02:45:57.159499 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159485 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 18 02:45:57.159499 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159488 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 18 02:45:57.159499 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159490 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 18 02:45:57.159499 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159493 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 18 02:45:57.159499 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159496 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 18 02:45:57.159499 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159499 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 18 02:45:57.159499 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159502 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 18 02:45:57.159499 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159505 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159508 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159510 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159513 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159517 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159519 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159521 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159524 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159527 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159529 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159532 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159535 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159538 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159543 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159560 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159565 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159569 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159573 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159576 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159579 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 18 02:45:57.159788 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159581 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159584 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159587 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159589 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159592 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159595 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159597 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159600 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159602 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159605 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159607 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159610 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159613 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159615 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159618 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159621 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159624 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159627 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159630 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 18 02:45:57.160275 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159632 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159635 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159637 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159640 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159643 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159645 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159649 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159653 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159656 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159659 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159661 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159664 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159666 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159669 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159671 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159674 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159676 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159679 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159682 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 18 02:45:57.160833 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159686 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159690 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159694 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159696 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159699 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159702 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159705 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159707 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159712 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159714 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159717 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159720 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159722 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159725 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159728 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159730 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159733 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 18 02:45:57.161297 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159736 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.159741 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159854 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159860 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159864 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159867 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159870 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159872 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159875 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159878 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159881 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159884 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159886 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159889 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159891 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159894 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 18 02:45:57.161723 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159896 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159899 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159902 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159904 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159906 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159909 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159912 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159916 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159923 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159926 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159928 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159931 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159934 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159937 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159941 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159943 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159946 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159949 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159951 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 18 02:45:57.162109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159954 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159957 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159959 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159962 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159964 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159966 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159969 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159971 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159974 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159976 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159979 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159981 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159984 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159986 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159989 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159992 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159994 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159997 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.159999 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160002 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 18 02:45:57.162569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160004 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160007 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160010 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160013 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160015 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160018 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160020 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160023 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160025 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160028 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160030 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160033 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160035 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160038 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160040 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160042 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160045 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160047 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160050 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160052 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 18 02:45:57.163046 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160055 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160058 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160060 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160063 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160065 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160067 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160070 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160072 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160075 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160078 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160081 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160083 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:57.160086 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.160090 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 18 02:45:57.163522 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.160810 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 18 02:45:57.164775 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.164761 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 18 02:45:57.165590 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.165579 2577 server.go:1019] "Starting client certificate rotation"
Apr 18 02:45:57.165693 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.165674 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 18 02:45:57.165725 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.165721 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 18 02:45:57.188034 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.188016 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 18 02:45:57.191919 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.191902 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 18 02:45:57.206380 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.206330 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 18 02:45:57.212434 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.212417 2577 log.go:25] "Validated CRI v1 image API"
Apr 18 02:45:57.213605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.213580 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 18 02:45:57.217022 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.217006 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 18 02:45:57.217164 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.217147 2577 fs.go:135] Filesystem UUIDs: map[3de1f2aa-fb12-4383-9ed0-89e08ec38b7d:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 8bc8850e-b506-435f-bbb1-5e147eaffb41:/dev/nvme0n1p3]
Apr 18 02:45:57.217200 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.217165 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 18 02:45:57.223441 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.223322 2577 manager.go:217] Machine: {Timestamp:2026-04-18 02:45:57.221574089 +0000 UTC m=+0.361425414 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098602 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec256b476415263f37b34ec954c5b0cd SystemUUID:ec256b47-6415-263f-37b3-4ec954c5b0cd BootID:09c82ed5-6df7-485f-8de1-8a42e56d2e67 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f6:97:f0:2a:3d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f6:97:f0:2a:3d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8a:c9:e9:f3:e5:a0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 18 02:45:57.223441 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.223431 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 18 02:45:57.223591 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.223504 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 18 02:45:57.224542 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.224519 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 18 02:45:57.224692 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.224544 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-79.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 18 02:45:57.224739 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.224701 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 18 02:45:57.224739 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.224709 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 18 02:45:57.224739 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.224722 2577 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 18 02:45:57.225337 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.225327 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 18 02:45:57.226740 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.226730 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 18 02:45:57.226838 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.226829 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 18 02:45:57.229617 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.229605 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 18 02:45:57.229617 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.229619 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 18 02:45:57.229700 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.229636 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 18 02:45:57.229700 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.229644 2577 kubelet.go:397] "Adding apiserver pod source" Apr 18 02:45:57.229700 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.229653 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 18 02:45:57.230963 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.230952 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 18 02:45:57.231008 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.230970 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 18 02:45:57.233644 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.233621 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 18 02:45:57.234992 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:45:57.234979 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 18 02:45:57.235131 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.235117 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2w9hw" Apr 18 02:45:57.236454 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.236442 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 18 02:45:57.236488 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.236464 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 18 02:45:57.236488 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.236473 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 18 02:45:57.236488 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.236483 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 18 02:45:57.236605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.236492 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 18 02:45:57.236605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.236500 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 18 02:45:57.236605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.236509 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 18 02:45:57.236605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.236518 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 18 02:45:57.236605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.236528 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 18 02:45:57.236605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.236536 2577 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/configmap" Apr 18 02:45:57.236605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.236573 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 18 02:45:57.236605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.236587 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 18 02:45:57.237401 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.237391 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 18 02:45:57.237401 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.237401 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 18 02:45:57.240900 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.240887 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 18 02:45:57.240963 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.240917 2577 server.go:1295] "Started kubelet" Apr 18 02:45:57.241031 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.241004 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 18 02:45:57.241067 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.241007 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 18 02:45:57.241097 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.241070 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 18 02:45:57.241478 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.241465 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-79.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 18 02:45:57.241665 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.241643 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-79.ec2.internal\" is forbidden: User 
\"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 18 02:45:57.241720 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.241709 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 18 02:45:57.242027 ip-10-0-128-79 systemd[1]: Started Kubernetes Kubelet. Apr 18 02:45:57.242290 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.242270 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2w9hw" Apr 18 02:45:57.242382 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.242372 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 18 02:45:57.243907 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.243892 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 18 02:45:57.249243 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.249226 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 18 02:45:57.249746 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.249734 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 18 02:45:57.250508 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.250489 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 18 02:45:57.250508 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.250492 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 18 02:45:57.250659 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.250524 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 18 
02:45:57.250659 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.250540 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 18 02:45:57.250659 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.250629 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 18 02:45:57.250659 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.250637 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 18 02:45:57.250800 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.250736 2577 factory.go:55] Registering systemd factory Apr 18 02:45:57.250800 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.250762 2577 factory.go:223] Registration of the systemd container factory successfully Apr 18 02:45:57.250904 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.250812 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:57.251000 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.250987 2577 factory.go:153] Registering CRI-O factory Apr 18 02:45:57.251052 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.251012 2577 factory.go:223] Registration of the crio container factory successfully Apr 18 02:45:57.251101 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.251066 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 18 02:45:57.251134 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.251114 2577 factory.go:103] Registering Raw factory Apr 18 02:45:57.251134 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.251130 2577 manager.go:1196] Started watching for new ooms in manager Apr 18 02:45:57.251633 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.251616 2577 
manager.go:319] Starting recovery of all containers Apr 18 02:45:57.256503 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.256331 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-79.ec2.internal\" not found" node="ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.257961 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.257930 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 18 02:45:57.264456 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.264439 2577 manager.go:324] Recovery completed Apr 18 02:45:57.268324 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.268312 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 18 02:45:57.270699 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.270684 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasSufficientMemory" Apr 18 02:45:57.270772 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.270711 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasNoDiskPressure" Apr 18 02:45:57.270772 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.270724 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasSufficientPID" Apr 18 02:45:57.271212 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.271195 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 18 02:45:57.271212 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.271212 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 18 02:45:57.271338 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.271230 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 18 02:45:57.274837 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.274821 2577 policy_none.go:49] "None 
policy: Start" Apr 18 02:45:57.274837 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.274839 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 18 02:45:57.274949 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.274851 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 18 02:45:57.340669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.322649 2577 manager.go:341] "Starting Device Plugin manager" Apr 18 02:45:57.340669 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.322678 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 18 02:45:57.340669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.322688 2577 server.go:85] "Starting device plugin registration server" Apr 18 02:45:57.340669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.322942 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 18 02:45:57.340669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.322977 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 18 02:45:57.340669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.323073 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 18 02:45:57.340669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.323154 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 18 02:45:57.340669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.323164 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 18 02:45:57.340669 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.323671 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 18 02:45:57.340669 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.323708 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:57.356250 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.356230 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 18 02:45:57.357396 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.357381 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 18 02:45:57.357457 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.357405 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 18 02:45:57.357457 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.357421 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 18 02:45:57.357457 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.357429 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 18 02:45:57.357570 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.357467 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 18 02:45:57.359753 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.359735 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 18 02:45:57.423836 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.423813 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 18 02:45:57.424819 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.424806 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasSufficientMemory" Apr 18 02:45:57.424886 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.424831 2577 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasNoDiskPressure" Apr 18 02:45:57.424886 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.424841 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasSufficientPID" Apr 18 02:45:57.424886 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.424874 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.433248 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.433234 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.433298 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.433253 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-79.ec2.internal\": node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:57.447636 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.447615 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:57.457830 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.457789 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-128-79.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal"] Apr 18 02:45:57.457884 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.457839 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 18 02:45:57.458564 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.458533 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasSufficientMemory" Apr 18 02:45:57.458642 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.458577 2577 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-128-79.ec2.internal" event="NodeHasNoDiskPressure" Apr 18 02:45:57.458642 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.458595 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasSufficientPID" Apr 18 02:45:57.460926 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.460915 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 18 02:45:57.461102 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.461089 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.461138 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.461122 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 18 02:45:57.461931 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.461914 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasSufficientMemory" Apr 18 02:45:57.462012 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.461941 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasNoDiskPressure" Apr 18 02:45:57.462012 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.461920 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasSufficientMemory" Apr 18 02:45:57.462012 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.461955 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasSufficientPID" Apr 18 02:45:57.462012 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.461982 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasNoDiskPressure" Apr 18 02:45:57.462012 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:45:57.461997 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasSufficientPID" Apr 18 02:45:57.464294 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.464278 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.464361 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.464303 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 18 02:45:57.464942 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.464927 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasSufficientMemory" Apr 18 02:45:57.465030 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.464954 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasNoDiskPressure" Apr 18 02:45:57.465030 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.464984 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeHasSufficientPID" Apr 18 02:45:57.478119 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.478097 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-79.ec2.internal\" not found" node="ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.481925 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.481910 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-79.ec2.internal\" not found" node="ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.547700 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.547681 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not 
found" Apr 18 02:45:57.551965 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.551951 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/104dca8b2a35645e419e747d78276979-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal\" (UID: \"104dca8b2a35645e419e747d78276979\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.552016 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.551973 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/104dca8b2a35645e419e747d78276979-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal\" (UID: \"104dca8b2a35645e419e747d78276979\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.552016 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.551991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f163ec50d423dcd51089184ab62a7d6-config\") pod \"kube-apiserver-proxy-ip-10-0-128-79.ec2.internal\" (UID: \"5f163ec50d423dcd51089184ab62a7d6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.648603 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.648579 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:57.652920 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.652904 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/104dca8b2a35645e419e747d78276979-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal\" (UID: \"104dca8b2a35645e419e747d78276979\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.652988 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.652938 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/104dca8b2a35645e419e747d78276979-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal\" (UID: \"104dca8b2a35645e419e747d78276979\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.652988 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.652954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f163ec50d423dcd51089184ab62a7d6-config\") pod \"kube-apiserver-proxy-ip-10-0-128-79.ec2.internal\" (UID: \"5f163ec50d423dcd51089184ab62a7d6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.653054 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.652993 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f163ec50d423dcd51089184ab62a7d6-config\") pod \"kube-apiserver-proxy-ip-10-0-128-79.ec2.internal\" (UID: \"5f163ec50d423dcd51089184ab62a7d6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.653054 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.653003 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/104dca8b2a35645e419e747d78276979-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal\" (UID: \"104dca8b2a35645e419e747d78276979\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.653054 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.653007 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/104dca8b2a35645e419e747d78276979-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal\" (UID: \"104dca8b2a35645e419e747d78276979\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.749197 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.749177 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:57.781644 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.781626 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.784068 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:57.784049 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" Apr 18 02:45:57.849824 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.849802 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:57.950228 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:57.950211 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:58.050644 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:58.050586 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:58.151052 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:58.151028 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:58.165309 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.165293 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to 
start using new credentials" Apr 18 02:45:58.165424 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.165410 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 18 02:45:58.165493 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.165464 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 18 02:45:58.244765 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.244726 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-17 02:40:57 +0000 UTC" deadline="2028-01-18 19:20:38.875149496 +0000 UTC" Apr 18 02:45:58.244765 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.244754 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15376h34m40.630399611s" Apr 18 02:45:58.250183 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.250158 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 18 02:45:58.251253 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:58.251234 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:58.260271 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.260253 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 18 02:45:58.279301 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:45:58.279276 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-sc7qf" Apr 18 02:45:58.286235 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.286217 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-sc7qf" Apr 18 02:45:58.299770 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:58.299748 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod104dca8b2a35645e419e747d78276979.slice/crio-ad56e3122d9db4df30a17c09c45f00691417c44f4ca541173650d58dfeb44617 WatchSource:0}: Error finding container ad56e3122d9db4df30a17c09c45f00691417c44f4ca541173650d58dfeb44617: Status 404 returned error can't find the container with id ad56e3122d9db4df30a17c09c45f00691417c44f4ca541173650d58dfeb44617 Apr 18 02:45:58.299944 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:58.299926 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f163ec50d423dcd51089184ab62a7d6.slice/crio-a3d18cfc55c4af7b6bd0a6695aa2960ec526e4bd5bd691f6aa07e5950666c3dc WatchSource:0}: Error finding container a3d18cfc55c4af7b6bd0a6695aa2960ec526e4bd5bd691f6aa07e5950666c3dc: Status 404 returned error can't find the container with id a3d18cfc55c4af7b6bd0a6695aa2960ec526e4bd5bd691f6aa07e5950666c3dc Apr 18 02:45:58.304391 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.304378 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 18 02:45:58.352055 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:58.352031 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:58.360772 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.360733 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" event={"ID":"104dca8b2a35645e419e747d78276979","Type":"ContainerStarted","Data":"ad56e3122d9db4df30a17c09c45f00691417c44f4ca541173650d58dfeb44617"} Apr 18 02:45:58.361624 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.361605 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-79.ec2.internal" event={"ID":"5f163ec50d423dcd51089184ab62a7d6","Type":"ContainerStarted","Data":"a3d18cfc55c4af7b6bd0a6695aa2960ec526e4bd5bd691f6aa07e5950666c3dc"} Apr 18 02:45:58.452986 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:58.452965 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:58.553490 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:58.553437 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:58.653914 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:58.653890 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:58.732681 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.732659 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 18 02:45:58.755003 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:58.754973 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-79.ec2.internal\" not found" Apr 18 02:45:58.821096 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.820872 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 18 02:45:58.850288 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.850263 2577 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-128-79.ec2.internal" Apr 18 02:45:58.857282 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.857262 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 18 02:45:58.858314 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.858295 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" Apr 18 02:45:58.866394 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:58.866375 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 18 02:45:59.230776 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.230754 2577 apiserver.go:52] "Watching apiserver" Apr 18 02:45:59.238624 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.238605 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 18 02:45:59.240643 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.240617 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-4x4ql","kube-system/kube-apiserver-proxy-ip-10-0-128-79.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm","openshift-image-registry/node-ca-cn84g","openshift-multus/multus-9ffdc","openshift-multus/multus-additional-cni-plugins-dd75m","openshift-ovn-kubernetes/ovnkube-node-cf2h2","openshift-cluster-node-tuning-operator/tuned-hrnvq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal","openshift-multus/network-metrics-daemon-6xc88","openshift-network-diagnostics/network-check-target-94m6z","openshift-network-operator/iptables-alerter-zwm9t"] Apr 18 02:45:59.243947 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.243928 2577 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4x4ql" Apr 18 02:45:59.246220 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.246055 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" Apr 18 02:45:59.246321 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.246263 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 18 02:45:59.246321 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.246289 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 18 02:45:59.246644 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.246613 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gsrlj\"" Apr 18 02:45:59.248569 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.248527 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 18 02:45:59.249229 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.249207 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 18 02:45:59.249462 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.249443 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 18 02:45:59.249591 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.249545 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-6mdf7\"" Apr 18 02:45:59.252443 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.252419 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cn84g" Apr 18 02:45:59.254796 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.254538 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 18 02:45:59.254796 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.254593 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wff7q\"" Apr 18 02:45:59.254796 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.254639 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 18 02:45:59.254796 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.254701 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 18 02:45:59.254796 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.254788 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.257065 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.257049 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.257203 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.257186 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.257808 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.257483 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qb6fj\"" Apr 18 02:45:59.257808 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.257589 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 18 02:45:59.257808 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.257486 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 18 02:45:59.257808 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.257676 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 18 02:45:59.257808 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.257746 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 18 02:45:59.259520 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.259503 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.259838 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.259817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1cbb8019-14fc-48b5-b072-319e2f45207e-konnectivity-ca\") pod \"konnectivity-agent-4x4ql\" (UID: \"1cbb8019-14fc-48b5-b072-319e2f45207e\") " pod="kube-system/konnectivity-agent-4x4ql" Apr 18 02:45:59.259943 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.259854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45rrd\" (UniqueName: \"kubernetes.io/projected/10696293-8e1c-431c-8a24-8d0bfab036d1-kube-api-access-45rrd\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" Apr 18 02:45:59.259943 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.259878 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-multus-socket-dir-parent\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.259943 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.259902 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-run-k8s-cni-cncf-io\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.259943 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.259927 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-device-dir\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" Apr 18 02:45:59.260180 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.259951 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d48f7502-3de3-4ca9-92d5-5eaf5e999c97-host\") pod \"node-ca-cn84g\" (UID: \"d48f7502-3de3-4ca9-92d5-5eaf5e999c97\") " pod="openshift-image-registry/node-ca-cn84g" Apr 18 02:45:59.260180 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.259992 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-cnibin\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.260180 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260036 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-var-lib-cni-bin\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.260180 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260062 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-var-lib-cni-multus\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.260180 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260087 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-hostroot\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.260180 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260113 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" Apr 18 02:45:59.260180 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-sys-fs\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" Apr 18 02:45:59.260516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260202 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljwmb\" (UniqueName: \"kubernetes.io/projected/1f8ead79-25ed-4501-ab2c-99de1d600ce7-kube-api-access-ljwmb\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.260516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260230 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1cbb8019-14fc-48b5-b072-319e2f45207e-agent-certs\") pod \"konnectivity-agent-4x4ql\" (UID: \"1cbb8019-14fc-48b5-b072-319e2f45207e\") " pod="kube-system/konnectivity-agent-4x4ql" Apr 18 02:45:59.260516 
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260255 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-registration-dir\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" Apr 18 02:45:59.260516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x56ww\" (UniqueName: \"kubernetes.io/projected/d48f7502-3de3-4ca9-92d5-5eaf5e999c97-kube-api-access-x56ww\") pod \"node-ca-cn84g\" (UID: \"d48f7502-3de3-4ca9-92d5-5eaf5e999c97\") " pod="openshift-image-registry/node-ca-cn84g" Apr 18 02:45:59.260516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260303 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-os-release\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.260516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260327 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-multus-conf-dir\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.260516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-system-cni-dir\") pod \"multus-9ffdc\" (UID: 
\"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.260516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260379 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-multus-cni-dir\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.260516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260401 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-etc-kubernetes\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.260516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260447 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-etc-selinux\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" Apr 18 02:45:59.260516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260471 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f8ead79-25ed-4501-ab2c-99de1d600ce7-cni-binary-copy\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.260516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260502 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-run-netns\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.261115 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260526 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1f8ead79-25ed-4501-ab2c-99de1d600ce7-multus-daemon-config\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.261115 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-run-multus-certs\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.261115 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-socket-dir\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" Apr 18 02:45:59.261115 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d48f7502-3de3-4ca9-92d5-5eaf5e999c97-serviceca\") pod \"node-ca-cn84g\" (UID: \"d48f7502-3de3-4ca9-92d5-5eaf5e999c97\") " pod="openshift-image-registry/node-ca-cn84g" Apr 18 02:45:59.261115 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.260719 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-var-lib-kubelet\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.261563 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.261119 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 18 02:45:59.261781 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.261766 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 18 02:45:59.261914 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.261898 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 18 02:45:59.261974 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.261962 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 18 02:45:59.262015 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.261998 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 18 02:45:59.262158 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.262138 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 18 02:45:59.262227 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.262189 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 18 02:45:59.262227 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.262220 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 18 02:45:59.262322 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.262139 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-klbqc\"" Apr 18 02:45:59.262376 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.262356 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tqqp8\"" Apr 18 02:45:59.262538 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.262519 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 18 02:45:59.262626 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.262587 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 18 02:45:59.262831 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.262817 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fqngg\"" Apr 18 02:45:59.265438 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.265394 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:45:59.265537 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:59.265474 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a" Apr 18 02:45:59.265537 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.265487 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z" Apr 18 02:45:59.265682 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:59.265540 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38" Apr 18 02:45:59.267905 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.267879 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zwm9t" Apr 18 02:45:59.270106 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.270084 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 18 02:45:59.270247 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.270229 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 18 02:45:59.270247 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.270236 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 18 02:45:59.270371 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.270299 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lv478\"" Apr 18 02:45:59.286862 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.286782 2577 
certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-17 02:40:58 +0000 UTC" deadline="2027-11-28 08:05:42.955387014 +0000 UTC"
Apr 18 02:45:59.286976 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.286866 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14141h19m43.66852762s"
Apr 18 02:45:59.352164 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.352139 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 18 02:45:59.361460 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-tuned\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq"
Apr 18 02:45:59.361605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361475 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-cni-binary-copy\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m"
Apr 18 02:45:59.361605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361508 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1cbb8019-14fc-48b5-b072-319e2f45207e-konnectivity-ca\") pod \"konnectivity-agent-4x4ql\" (UID: \"1cbb8019-14fc-48b5-b072-319e2f45207e\") " pod="kube-system/konnectivity-agent-4x4ql"
Apr 18 02:45:59.361605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361532 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45rrd\" (UniqueName: \"kubernetes.io/projected/10696293-8e1c-431c-8a24-8d0bfab036d1-kube-api-access-45rrd\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.361605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-multus-socket-dir-parent\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.361605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361594 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-var-lib-openvswitch\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.361853 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361627 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20075b24-809d-40f9-8a39-d31291dbdc96-ovn-node-metrics-cert\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.361853 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-sysctl-d\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq"
Apr 18 02:45:59.361853 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361675 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d48f7502-3de3-4ca9-92d5-5eaf5e999c97-host\") pod \"node-ca-cn84g\" (UID: \"d48f7502-3de3-4ca9-92d5-5eaf5e999c97\") " pod="openshift-image-registry/node-ca-cn84g"
Apr 18 02:45:59.361853 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-cnibin\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.361853 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361724 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-var-lib-cni-bin\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.361853 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361747 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-var-lib-cni-multus\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.361853 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361771 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-systemd-units\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.361853 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361786 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-run-ovn-kubernetes\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.361853 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20075b24-809d-40f9-8a39-d31291dbdc96-ovnkube-config\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.361853 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-sys-fs\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.361853 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwmb\" (UniqueName: \"kubernetes.io/projected/1f8ead79-25ed-4501-ab2c-99de1d600ce7-kube-api-access-ljwmb\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-var-lib-kubelet\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361900 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-cnibin\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1cbb8019-14fc-48b5-b072-319e2f45207e-agent-certs\") pod \"konnectivity-agent-4x4ql\" (UID: \"1cbb8019-14fc-48b5-b072-319e2f45207e\") " pod="kube-system/konnectivity-agent-4x4ql"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361946 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-registration-dir\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x56ww\" (UniqueName: \"kubernetes.io/projected/d48f7502-3de3-4ca9-92d5-5eaf5e999c97-kube-api-access-x56ww\") pod \"node-ca-cn84g\" (UID: \"d48f7502-3de3-4ca9-92d5-5eaf5e999c97\") " pod="openshift-image-registry/node-ca-cn84g"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.361997 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-os-release\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-etc-openvswitch\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362038 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-var-lib-cni-bin\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362091 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/20075b24-809d-40f9-8a39-d31291dbdc96-ovnkube-script-lib\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362098 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-multus-socket-dir-parent\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362128 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-cnibin\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-sysconfig\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-systemd\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362179 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-sys-fs\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362181 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d48f7502-3de3-4ca9-92d5-5eaf5e999c97-host\") pod \"node-ca-cn84g\" (UID: \"d48f7502-3de3-4ca9-92d5-5eaf5e999c97\") " pod="openshift-image-registry/node-ca-cn84g"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362221 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-var-lib-cni-multus\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.362284 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-registration-dir\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362229 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-os-release\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-multus-cni-dir\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-run\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-multus-cni-dir\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362467 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hwf9\" (UniqueName: \"kubernetes.io/projected/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-kube-api-access-6hwf9\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-etc-selinux\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-run-netns\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362533 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1f8ead79-25ed-4501-ab2c-99de1d600ce7-multus-daemon-config\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-run-multus-certs\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-etc-selinux\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362621 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-run-systemd\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362656 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-run-multus-certs\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362595 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-run-netns\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362668 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-node-log\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362702 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brmnn\" (UniqueName: \"kubernetes.io/projected/64806518-b360-4104-92e5-8a3017ab382a-kube-api-access-brmnn\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-socket-dir\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.363017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d48f7502-3de3-4ca9-92d5-5eaf5e999c97-serviceca\") pod \"node-ca-cn84g\" (UID: \"d48f7502-3de3-4ca9-92d5-5eaf5e999c97\") " pod="openshift-image-registry/node-ca-cn84g"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362780 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms64s\" (UniqueName: \"kubernetes.io/projected/20075b24-809d-40f9-8a39-d31291dbdc96-kube-api-access-ms64s\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-modprobe-d\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362870 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-socket-dir\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-system-cni-dir\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362938 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rwml\" (UniqueName: \"kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml\") pod \"network-check-target-94m6z\" (UID: \"0ec2b026-c157-4159-99e6-03c6327b4c38\") " pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362963 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dx7n\" (UniqueName: \"kubernetes.io/projected/dcafdc12-cd15-48ec-90e4-eede66deb4e9-kube-api-access-7dx7n\") pod \"iptables-alerter-zwm9t\" (UID: \"dcafdc12-cd15-48ec-90e4-eede66deb4e9\") " pod="openshift-network-operator/iptables-alerter-zwm9t"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.362991 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-var-lib-kubelet\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-run-openvswitch\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-log-socket\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95lb5\" (UniqueName: \"kubernetes.io/projected/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-kube-api-access-95lb5\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-var-lib-kubelet\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-run-k8s-cni-cncf-io\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363156 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1cbb8019-14fc-48b5-b072-319e2f45207e-konnectivity-ca\") pod \"konnectivity-agent-4x4ql\" (UID: \"1cbb8019-14fc-48b5-b072-319e2f45207e\") " pod="kube-system/konnectivity-agent-4x4ql"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1f8ead79-25ed-4501-ab2c-99de1d600ce7-multus-daemon-config\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363158 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-sysctl-conf\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363236 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d48f7502-3de3-4ca9-92d5-5eaf5e999c97-serviceca\") pod \"node-ca-cn84g\" (UID: \"d48f7502-3de3-4ca9-92d5-5eaf5e999c97\") " pod="openshift-image-registry/node-ca-cn84g"
Apr 18 02:45:59.363764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363246 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-lib-modules\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363240 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-host-run-k8s-cni-cncf-io\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363291 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363320 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/dcafdc12-cd15-48ec-90e4-eede66deb4e9-iptables-alerter-script\") pod \"iptables-alerter-zwm9t\" (UID: \"dcafdc12-cd15-48ec-90e4-eede66deb4e9\") " pod="openshift-network-operator/iptables-alerter-zwm9t"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-device-dir\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-hostroot\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-run-netns\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-kubernetes\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363452 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363466 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363483 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363496 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-cni-netd\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363510 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-hostroot\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363522 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363589 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-multus-conf-dir\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-system-cni-dir\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363429 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-device-dir\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.364573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363622 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10696293-8e1c-431c-8a24-8d0bfab036d1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-etc-kubernetes\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363671 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-system-cni-dir\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363672 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-multus-conf-dir\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363704 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f8ead79-25ed-4501-ab2c-99de1d600ce7-etc-kubernetes\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363750 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-tmp\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f8ead79-25ed-4501-ab2c-99de1d600ce7-cni-binary-copy\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-kubelet\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363859 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-slash\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363890 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-os-release\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dcafdc12-cd15-48ec-90e4-eede66deb4e9-host-slash\") pod \"iptables-alerter-zwm9t\" (UID: \"dcafdc12-cd15-48ec-90e4-eede66deb4e9\") " pod="openshift-network-operator/iptables-alerter-zwm9t"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363962 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-run-ovn\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.363987 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-cni-bin\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.364011
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20075b24-809d-40f9-8a39-d31291dbdc96-env-overrides\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.364032 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-sys\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.364057 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-host\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.365296 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.364244 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f8ead79-25ed-4501-ab2c-99de1d600ce7-cni-binary-copy\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.366831 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.366807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1cbb8019-14fc-48b5-b072-319e2f45207e-agent-certs\") pod \"konnectivity-agent-4x4ql\" (UID: \"1cbb8019-14fc-48b5-b072-319e2f45207e\") " pod="kube-system/konnectivity-agent-4x4ql" Apr 18 02:45:59.370304 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.370285 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljwmb\" (UniqueName: \"kubernetes.io/projected/1f8ead79-25ed-4501-ab2c-99de1d600ce7-kube-api-access-ljwmb\") pod \"multus-9ffdc\" (UID: \"1f8ead79-25ed-4501-ab2c-99de1d600ce7\") " pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.370410 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.370358 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45rrd\" (UniqueName: \"kubernetes.io/projected/10696293-8e1c-431c-8a24-8d0bfab036d1-kube-api-access-45rrd\") pod \"aws-ebs-csi-driver-node-vfdxm\" (UID: \"10696293-8e1c-431c-8a24-8d0bfab036d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" Apr 18 02:45:59.370780 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.370759 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x56ww\" (UniqueName: \"kubernetes.io/projected/d48f7502-3de3-4ca9-92d5-5eaf5e999c97-kube-api-access-x56ww\") pod \"node-ca-cn84g\" (UID: \"d48f7502-3de3-4ca9-92d5-5eaf5e999c97\") " pod="openshift-image-registry/node-ca-cn84g" Apr 18 02:45:59.464628 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-var-lib-kubelet\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.464787 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464717 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-var-lib-kubelet\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.464787 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:45:59.464735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-cnibin\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.464787 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-etc-openvswitch\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.464946 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464786 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-cnibin\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.464946 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464804 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/20075b24-809d-40f9-8a39-d31291dbdc96-ovnkube-script-lib\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.464946 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-sysconfig\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.464946 
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464838 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-etc-openvswitch\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.464946 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-systemd\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.464946 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-run\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.464946 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464905 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hwf9\" (UniqueName: \"kubernetes.io/projected/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-kube-api-access-6hwf9\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.464946 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464935 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-run-systemd\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 
02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-node-log\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.464986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brmnn\" (UniqueName: \"kubernetes.io/projected/64806518-b360-4104-92e5-8a3017ab382a-kube-api-access-brmnn\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ms64s\" (UniqueName: \"kubernetes.io/projected/20075b24-809d-40f9-8a39-d31291dbdc96-kube-api-access-ms64s\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-modprobe-d\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-system-cni-dir\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " 
pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwml\" (UniqueName: \"kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml\") pod \"network-check-target-94m6z\" (UID: \"0ec2b026-c157-4159-99e6-03c6327b4c38\") " pod="openshift-network-diagnostics/network-check-target-94m6z" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dx7n\" (UniqueName: \"kubernetes.io/projected/dcafdc12-cd15-48ec-90e4-eede66deb4e9-kube-api-access-7dx7n\") pod \"iptables-alerter-zwm9t\" (UID: \"dcafdc12-cd15-48ec-90e4-eede66deb4e9\") " pod="openshift-network-operator/iptables-alerter-zwm9t" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465144 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-run-openvswitch\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-log-socket\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95lb5\" (UniqueName: 
\"kubernetes.io/projected/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-kube-api-access-95lb5\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-run\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-sysctl-conf\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-lib-modules\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465270 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-modprobe-d\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.465300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/dcafdc12-cd15-48ec-90e4-eede66deb4e9-iptables-alerter-script\") pod \"iptables-alerter-zwm9t\" (UID: \"dcafdc12-cd15-48ec-90e4-eede66deb4e9\") " pod="openshift-network-operator/iptables-alerter-zwm9t" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465324 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-run-netns\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465327 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-node-log\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465393 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-run-systemd\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465405 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/20075b24-809d-40f9-8a39-d31291dbdc96-ovnkube-script-lib\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465415 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-systemd\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-kubernetes\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465448 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-sysconfig\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465596 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-log-socket\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465642 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-lib-modules\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465646 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465656 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465659 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-sysctl-conf\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465678 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-run-netns\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465691 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-run-openvswitch\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465730 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-cni-netd\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:59.465734 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:45:59.466063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:59.465838 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs podName:64806518-b360-4104-92e5-8a3017ab382a nodeName:}" failed. No retries permitted until 2026-04-18 02:45:59.965789776 +0000 UTC m=+3.105641093 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs") pod "network-metrics-daemon-6xc88" (UID: "64806518-b360-4104-92e5-8a3017ab382a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-kubernetes\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-cni-netd\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465835 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465860 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-system-cni-dir\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-tmp\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.465982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-kubelet\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-slash\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-os-release\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466041 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-kubelet\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466054 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dcafdc12-cd15-48ec-90e4-eede66deb4e9-host-slash\") pod \"iptables-alerter-zwm9t\" (UID: \"dcafdc12-cd15-48ec-90e4-eede66deb4e9\") " pod="openshift-network-operator/iptables-alerter-zwm9t" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-run-ovn\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466086 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-slash\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466103 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-cni-bin\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.466930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20075b24-809d-40f9-8a39-d31291dbdc96-env-overrides\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dcafdc12-cd15-48ec-90e4-eede66deb4e9-host-slash\") pod \"iptables-alerter-zwm9t\" (UID: \"dcafdc12-cd15-48ec-90e4-eede66deb4e9\") " pod="openshift-network-operator/iptables-alerter-zwm9t" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466148 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-sys\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-host\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466191 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-os-release\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-tuned\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466219 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/dcafdc12-cd15-48ec-90e4-eede66deb4e9-iptables-alerter-script\") pod \"iptables-alerter-zwm9t\" (UID: \"dcafdc12-cd15-48ec-90e4-eede66deb4e9\") " pod="openshift-network-operator/iptables-alerter-zwm9t" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-cni-binary-copy\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-run-ovn\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466266 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-var-lib-openvswitch\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20075b24-809d-40f9-8a39-d31291dbdc96-ovn-node-metrics-cert\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466322 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-sys\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: 
I0418 02:45:59.466345 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-sysctl-d\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466352 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-host\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466358 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-var-lib-openvswitch\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.467696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466394 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-cni-bin\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.468475 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466476 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-sysctl-d\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.468475 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466517 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-systemd-units\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.468475 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-run-ovn-kubernetes\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.468475 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20075b24-809d-40f9-8a39-d31291dbdc96-ovnkube-config\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.468475 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466611 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-host-run-ovn-kubernetes\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.468475 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.466586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/20075b24-809d-40f9-8a39-d31291dbdc96-systemd-units\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.468475 ip-10-0-128-79 kubenswrapper[2577]: 
I0418 02:45:59.467016 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20075b24-809d-40f9-8a39-d31291dbdc96-ovnkube-config\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.468475 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.467149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20075b24-809d-40f9-8a39-d31291dbdc96-env-overrides\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.468475 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.467270 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-cni-binary-copy\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.469104 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.469080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-tmp\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.469184 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.469112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-etc-tuned\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.469898 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.469874 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20075b24-809d-40f9-8a39-d31291dbdc96-ovn-node-metrics-cert\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.471882 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:59.471851 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 18 02:45:59.471882 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:59.471876 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 18 02:45:59.472052 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:59.471888 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9rwml for pod openshift-network-diagnostics/network-check-target-94m6z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:45:59.472052 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:59.471952 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml podName:0ec2b026-c157-4159-99e6-03c6327b4c38 nodeName:}" failed. No retries permitted until 2026-04-18 02:45:59.971935836 +0000 UTC m=+3.111787169 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9rwml" (UniqueName: "kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml") pod "network-check-target-94m6z" (UID: "0ec2b026-c157-4159-99e6-03c6327b4c38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:45:59.474436 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.474410 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dx7n\" (UniqueName: \"kubernetes.io/projected/dcafdc12-cd15-48ec-90e4-eede66deb4e9-kube-api-access-7dx7n\") pod \"iptables-alerter-zwm9t\" (UID: \"dcafdc12-cd15-48ec-90e4-eede66deb4e9\") " pod="openshift-network-operator/iptables-alerter-zwm9t" Apr 18 02:45:59.475161 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.475029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hwf9\" (UniqueName: \"kubernetes.io/projected/200381f5-de50-4d9c-ba7d-aac4abdd4c3d-kube-api-access-6hwf9\") pod \"multus-additional-cni-plugins-dd75m\" (UID: \"200381f5-de50-4d9c-ba7d-aac4abdd4c3d\") " pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.475161 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.475121 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms64s\" (UniqueName: \"kubernetes.io/projected/20075b24-809d-40f9-8a39-d31291dbdc96-kube-api-access-ms64s\") pod \"ovnkube-node-cf2h2\" (UID: \"20075b24-809d-40f9-8a39-d31291dbdc96\") " pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.475405 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.475385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brmnn\" (UniqueName: \"kubernetes.io/projected/64806518-b360-4104-92e5-8a3017ab382a-kube-api-access-brmnn\") pod \"network-metrics-daemon-6xc88\" (UID: 
\"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:45:59.475997 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.475976 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95lb5\" (UniqueName: \"kubernetes.io/projected/769626a8-75a8-4f3b-82c6-400dd0e7d7cd-kube-api-access-95lb5\") pod \"tuned-hrnvq\" (UID: \"769626a8-75a8-4f3b-82c6-400dd0e7d7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.557308 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.557233 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4x4ql" Apr 18 02:45:59.564954 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.564929 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" Apr 18 02:45:59.573534 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.573511 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cn84g" Apr 18 02:45:59.581066 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.581048 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9ffdc" Apr 18 02:45:59.587578 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.587561 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dd75m" Apr 18 02:45:59.596151 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.596131 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" Apr 18 02:45:59.601613 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.601594 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" Apr 18 02:45:59.609102 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.609084 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zwm9t" Apr 18 02:45:59.710434 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.710407 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 18 02:45:59.718339 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.718319 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 18 02:45:59.970211 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:45:59.970193 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:45:59.970302 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:59.970291 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:45:59.970348 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:45:59.970340 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs podName:64806518-b360-4104-92e5-8a3017ab382a nodeName:}" failed. No retries permitted until 2026-04-18 02:46:00.970327613 +0000 UTC m=+4.110178939 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs") pod "network-metrics-daemon-6xc88" (UID: "64806518-b360-4104-92e5-8a3017ab382a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:45:59.976641 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:59.976417 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f8ead79_25ed_4501_ab2c_99de1d600ce7.slice/crio-6e28fec9a5be3f473e205dfd35aedda98a5ba6e8c7e1358884aa5ace920b46fe WatchSource:0}: Error finding container 6e28fec9a5be3f473e205dfd35aedda98a5ba6e8c7e1358884aa5ace920b46fe: Status 404 returned error can't find the container with id 6e28fec9a5be3f473e205dfd35aedda98a5ba6e8c7e1358884aa5ace920b46fe Apr 18 02:45:59.979937 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:59.979912 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cbb8019_14fc_48b5_b072_319e2f45207e.slice/crio-83c6079ebc6c691cb5b051c81d6fa829c1e6b5bff71c2d07112f1743658c2a52 WatchSource:0}: Error finding container 83c6079ebc6c691cb5b051c81d6fa829c1e6b5bff71c2d07112f1743658c2a52: Status 404 returned error can't find the container with id 83c6079ebc6c691cb5b051c81d6fa829c1e6b5bff71c2d07112f1743658c2a52 Apr 18 02:45:59.981975 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:59.981923 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod769626a8_75a8_4f3b_82c6_400dd0e7d7cd.slice/crio-d58ff22927889bbe4f899e247da47464a0f80e6ff59d60897f94312e98bf521f WatchSource:0}: Error finding container d58ff22927889bbe4f899e247da47464a0f80e6ff59d60897f94312e98bf521f: Status 404 returned error can't find the container with id d58ff22927889bbe4f899e247da47464a0f80e6ff59d60897f94312e98bf521f Apr 18 02:45:59.982832 
ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:59.982806 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10696293_8e1c_431c_8a24_8d0bfab036d1.slice/crio-354b4e0d67ca0ad61e7146a182e41d40cd929830e78f310219e0e3ede4816c41 WatchSource:0}: Error finding container 354b4e0d67ca0ad61e7146a182e41d40cd929830e78f310219e0e3ede4816c41: Status 404 returned error can't find the container with id 354b4e0d67ca0ad61e7146a182e41d40cd929830e78f310219e0e3ede4816c41 Apr 18 02:45:59.984480 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:59.984433 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod200381f5_de50_4d9c_ba7d_aac4abdd4c3d.slice/crio-07bcd0f7fc6cdcee8ddb7892f04e247fbdf4d3d5aa25b47cdce4182f38dcb224 WatchSource:0}: Error finding container 07bcd0f7fc6cdcee8ddb7892f04e247fbdf4d3d5aa25b47cdce4182f38dcb224: Status 404 returned error can't find the container with id 07bcd0f7fc6cdcee8ddb7892f04e247fbdf4d3d5aa25b47cdce4182f38dcb224 Apr 18 02:45:59.985379 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:59.985354 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20075b24_809d_40f9_8a39_d31291dbdc96.slice/crio-a8ba623fa8b1160973496867b2f9b74f713729b67c5dbfcfde7936d2becd8da7 WatchSource:0}: Error finding container a8ba623fa8b1160973496867b2f9b74f713729b67c5dbfcfde7936d2becd8da7: Status 404 returned error can't find the container with id a8ba623fa8b1160973496867b2f9b74f713729b67c5dbfcfde7936d2becd8da7 Apr 18 02:45:59.987386 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:59.987358 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcafdc12_cd15_48ec_90e4_eede66deb4e9.slice/crio-a102c0953a03355bf5d442153e09466ed91a506ae9d21b8cd9a27f635a3ce705 WatchSource:0}: Error 
finding container a102c0953a03355bf5d442153e09466ed91a506ae9d21b8cd9a27f635a3ce705: Status 404 returned error can't find the container with id a102c0953a03355bf5d442153e09466ed91a506ae9d21b8cd9a27f635a3ce705 Apr 18 02:45:59.989444 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:45:59.989318 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd48f7502_3de3_4ca9_92d5_5eaf5e999c97.slice/crio-4e4b99dee68f5bfd17cffd93707d91a9dd4b2a14f873cf9c818e066e857b912a WatchSource:0}: Error finding container 4e4b99dee68f5bfd17cffd93707d91a9dd4b2a14f873cf9c818e066e857b912a: Status 404 returned error can't find the container with id 4e4b99dee68f5bfd17cffd93707d91a9dd4b2a14f873cf9c818e066e857b912a Apr 18 02:46:00.070685 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.070662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwml\" (UniqueName: \"kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml\") pod \"network-check-target-94m6z\" (UID: \"0ec2b026-c157-4159-99e6-03c6327b4c38\") " pod="openshift-network-diagnostics/network-check-target-94m6z" Apr 18 02:46:00.070790 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:00.070776 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 18 02:46:00.070840 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:00.070793 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 18 02:46:00.070840 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:00.070802 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9rwml for pod openshift-network-diagnostics/network-check-target-94m6z: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:46:00.070912 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:00.070845 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml podName:0ec2b026-c157-4159-99e6-03c6327b4c38 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:01.070832647 +0000 UTC m=+4.210683974 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9rwml" (UniqueName: "kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml") pod "network-check-target-94m6z" (UID: "0ec2b026-c157-4159-99e6-03c6327b4c38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:46:00.287174 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.287130 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-17 02:40:58 +0000 UTC" deadline="2027-12-16 11:19:27.143165191 +0000 UTC" Apr 18 02:46:00.287174 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.287168 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14576h33m26.856000686s" Apr 18 02:46:00.368622 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.368538 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dd75m" event={"ID":"200381f5-de50-4d9c-ba7d-aac4abdd4c3d","Type":"ContainerStarted","Data":"07bcd0f7fc6cdcee8ddb7892f04e247fbdf4d3d5aa25b47cdce4182f38dcb224"} Apr 18 02:46:00.370743 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.370704 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4x4ql" 
event={"ID":"1cbb8019-14fc-48b5-b072-319e2f45207e","Type":"ContainerStarted","Data":"83c6079ebc6c691cb5b051c81d6fa829c1e6b5bff71c2d07112f1743658c2a52"} Apr 18 02:46:00.378352 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.377820 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-79.ec2.internal" event={"ID":"5f163ec50d423dcd51089184ab62a7d6","Type":"ContainerStarted","Data":"c319a09a76e82abd0dd50a01c1232fdfad6707831ae2eabd746d69c09906a474"} Apr 18 02:46:00.385707 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.385657 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zwm9t" event={"ID":"dcafdc12-cd15-48ec-90e4-eede66deb4e9","Type":"ContainerStarted","Data":"a102c0953a03355bf5d442153e09466ed91a506ae9d21b8cd9a27f635a3ce705"} Apr 18 02:46:00.393023 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.392984 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" event={"ID":"10696293-8e1c-431c-8a24-8d0bfab036d1","Type":"ContainerStarted","Data":"354b4e0d67ca0ad61e7146a182e41d40cd929830e78f310219e0e3ede4816c41"} Apr 18 02:46:00.403995 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.403952 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" event={"ID":"769626a8-75a8-4f3b-82c6-400dd0e7d7cd","Type":"ContainerStarted","Data":"d58ff22927889bbe4f899e247da47464a0f80e6ff59d60897f94312e98bf521f"} Apr 18 02:46:00.408477 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.408444 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9ffdc" event={"ID":"1f8ead79-25ed-4501-ab2c-99de1d600ce7","Type":"ContainerStarted","Data":"6e28fec9a5be3f473e205dfd35aedda98a5ba6e8c7e1358884aa5ace920b46fe"} Apr 18 02:46:00.410391 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.410339 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-image-registry/node-ca-cn84g" event={"ID":"d48f7502-3de3-4ca9-92d5-5eaf5e999c97","Type":"ContainerStarted","Data":"4e4b99dee68f5bfd17cffd93707d91a9dd4b2a14f873cf9c818e066e857b912a"} Apr 18 02:46:00.413641 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.413591 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" event={"ID":"20075b24-809d-40f9-8a39-d31291dbdc96","Type":"ContainerStarted","Data":"a8ba623fa8b1160973496867b2f9b74f713729b67c5dbfcfde7936d2becd8da7"} Apr 18 02:46:00.978032 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:00.977995 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:46:00.978213 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:00.978178 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:46:00.978317 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:00.978237 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs podName:64806518-b360-4104-92e5-8a3017ab382a nodeName:}" failed. No retries permitted until 2026-04-18 02:46:02.978218739 +0000 UTC m=+6.118070075 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs") pod "network-metrics-daemon-6xc88" (UID: "64806518-b360-4104-92e5-8a3017ab382a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:01.078970 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:01.078885 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwml\" (UniqueName: \"kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml\") pod \"network-check-target-94m6z\" (UID: \"0ec2b026-c157-4159-99e6-03c6327b4c38\") " pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:01.079186 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:01.079078 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 18 02:46:01.079186 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:01.079106 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 18 02:46:01.079186 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:01.079122 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9rwml for pod openshift-network-diagnostics/network-check-target-94m6z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:01.079354 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:01.079194 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml podName:0ec2b026-c157-4159-99e6-03c6327b4c38 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:03.079173428 +0000 UTC m=+6.219024746 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9rwml" (UniqueName: "kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml") pod "network-check-target-94m6z" (UID: "0ec2b026-c157-4159-99e6-03c6327b4c38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:01.361105 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:01.360403 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:01.361105 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:01.360532 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a"
Apr 18 02:46:01.361105 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:01.360925 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:01.361105 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:01.361007 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38"
Apr 18 02:46:01.421926 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:01.421770 2577 generic.go:358] "Generic (PLEG): container finished" podID="104dca8b2a35645e419e747d78276979" containerID="b4dda75fcd73438d5ad32042ebeb1f6ec440397a769275e19f9bc43dc74a0bcc" exitCode=0
Apr 18 02:46:01.422687 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:01.422660 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" event={"ID":"104dca8b2a35645e419e747d78276979","Type":"ContainerDied","Data":"b4dda75fcd73438d5ad32042ebeb1f6ec440397a769275e19f9bc43dc74a0bcc"}
Apr 18 02:46:01.435118 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:01.435057 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-79.ec2.internal" podStartSLOduration=3.435041445 podStartE2EDuration="3.435041445s" podCreationTimestamp="2026-04-18 02:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:46:00.392881719 +0000 UTC m=+3.532733055" watchObservedRunningTime="2026-04-18 02:46:01.435041445 +0000 UTC m=+4.574892781"
Apr 18 02:46:02.427576 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:02.427274 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" event={"ID":"104dca8b2a35645e419e747d78276979","Type":"ContainerStarted","Data":"1a0a62be8a34b54d085c257857b95a8a4a9b2cf3655581634c9149f94b883ab0"}
Apr 18 02:46:02.440787 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:02.440447 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-79.ec2.internal" podStartSLOduration=4.440430235 podStartE2EDuration="4.440430235s" podCreationTimestamp="2026-04-18 02:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:46:02.440306262 +0000 UTC m=+5.580157623" watchObservedRunningTime="2026-04-18 02:46:02.440430235 +0000 UTC m=+5.580281570"
Apr 18 02:46:02.996381 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:02.995826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:02.996381 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:02.995980 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:02.996381 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:02.996042 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs podName:64806518-b360-4104-92e5-8a3017ab382a nodeName:}" failed. No retries permitted until 2026-04-18 02:46:06.996022519 +0000 UTC m=+10.135873846 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs") pod "network-metrics-daemon-6xc88" (UID: "64806518-b360-4104-92e5-8a3017ab382a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:03.040706 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.039978 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7fmtp"]
Apr 18 02:46:03.043248 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.043225 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:03.043384 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:03.043299 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0"
Apr 18 02:46:03.096978 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.096944 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b37c426b-d777-4a11-9630-cc3f589672b0-kubelet-config\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:03.097121 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.096995 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:03.097121 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.097055 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b37c426b-d777-4a11-9630-cc3f589672b0-dbus\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:03.097121 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.097094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwml\" (UniqueName: \"kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml\") pod \"network-check-target-94m6z\" (UID: \"0ec2b026-c157-4159-99e6-03c6327b4c38\") " pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:03.097294 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:03.097217 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 18 02:46:03.097294 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:03.097232 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 18 02:46:03.097294 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:03.097244 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9rwml for pod openshift-network-diagnostics/network-check-target-94m6z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:03.097432 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:03.097301 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml podName:0ec2b026-c157-4159-99e6-03c6327b4c38 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:07.097282008 +0000 UTC m=+10.237133349 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-9rwml" (UniqueName: "kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml") pod "network-check-target-94m6z" (UID: "0ec2b026-c157-4159-99e6-03c6327b4c38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:03.197531 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.197494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b37c426b-d777-4a11-9630-cc3f589672b0-dbus\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:03.197720 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.197609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b37c426b-d777-4a11-9630-cc3f589672b0-kubelet-config\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:03.197720 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.197639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:03.197720 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.197703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b37c426b-d777-4a11-9630-cc3f589672b0-dbus\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:03.197880 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:03.197752 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 18 02:46:03.197880 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.197769 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b37c426b-d777-4a11-9630-cc3f589672b0-kubelet-config\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:03.197880 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:03.197807 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret podName:b37c426b-d777-4a11-9630-cc3f589672b0 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:03.697789039 +0000 UTC m=+6.837640355 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret") pod "global-pull-secret-syncer-7fmtp" (UID: "b37c426b-d777-4a11-9630-cc3f589672b0") : object "kube-system"/"original-pull-secret" not registered
Apr 18 02:46:03.358458 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.358385 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:03.358616 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:03.358519 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38"
Apr 18 02:46:03.358884 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.358386 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:03.359001 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:03.358977 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a"
Apr 18 02:46:03.702333 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:03.702237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:03.702781 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:03.702383 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 18 02:46:03.702781 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:03.702443 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret podName:b37c426b-d777-4a11-9630-cc3f589672b0 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:04.70242573 +0000 UTC m=+7.842277044 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret") pod "global-pull-secret-syncer-7fmtp" (UID: "b37c426b-d777-4a11-9630-cc3f589672b0") : object "kube-system"/"original-pull-secret" not registered
Apr 18 02:46:04.358304 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:04.358271 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:04.358481 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:04.358405 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0"
Apr 18 02:46:04.711007 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:04.710909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:04.711476 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:04.711054 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 18 02:46:04.711476 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:04.711119 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret podName:b37c426b-d777-4a11-9630-cc3f589672b0 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:06.711100214 +0000 UTC m=+9.850951528 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret") pod "global-pull-secret-syncer-7fmtp" (UID: "b37c426b-d777-4a11-9630-cc3f589672b0") : object "kube-system"/"original-pull-secret" not registered
Apr 18 02:46:04.884584 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:04.884531 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-qvkw2"]
Apr 18 02:46:04.888748 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:04.888727 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qvkw2"
Apr 18 02:46:04.891868 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:04.891843 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-g2fpl\""
Apr 18 02:46:04.892114 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:04.892098 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 18 02:46:04.893081 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:04.893061 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 18 02:46:04.912579 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:04.912416 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/525e6e89-8bf5-472a-bde7-bfb1254515af-tmp-dir\") pod \"node-resolver-qvkw2\" (UID: \"525e6e89-8bf5-472a-bde7-bfb1254515af\") " pod="openshift-dns/node-resolver-qvkw2"
Apr 18 02:46:04.912579 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:04.912480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/525e6e89-8bf5-472a-bde7-bfb1254515af-hosts-file\") pod \"node-resolver-qvkw2\" (UID: \"525e6e89-8bf5-472a-bde7-bfb1254515af\") " pod="openshift-dns/node-resolver-qvkw2"
Apr 18 02:46:04.912579 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:04.912508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kqr9\" (UniqueName: \"kubernetes.io/projected/525e6e89-8bf5-472a-bde7-bfb1254515af-kube-api-access-6kqr9\") pod \"node-resolver-qvkw2\" (UID: \"525e6e89-8bf5-472a-bde7-bfb1254515af\") " pod="openshift-dns/node-resolver-qvkw2"
Apr 18 02:46:05.013850 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:05.013816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/525e6e89-8bf5-472a-bde7-bfb1254515af-tmp-dir\") pod \"node-resolver-qvkw2\" (UID: \"525e6e89-8bf5-472a-bde7-bfb1254515af\") " pod="openshift-dns/node-resolver-qvkw2"
Apr 18 02:46:05.014021 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:05.013877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/525e6e89-8bf5-472a-bde7-bfb1254515af-hosts-file\") pod \"node-resolver-qvkw2\" (UID: \"525e6e89-8bf5-472a-bde7-bfb1254515af\") " pod="openshift-dns/node-resolver-qvkw2"
Apr 18 02:46:05.014021 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:05.013903 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kqr9\" (UniqueName: \"kubernetes.io/projected/525e6e89-8bf5-472a-bde7-bfb1254515af-kube-api-access-6kqr9\") pod \"node-resolver-qvkw2\" (UID: \"525e6e89-8bf5-472a-bde7-bfb1254515af\") " pod="openshift-dns/node-resolver-qvkw2"
Apr 18 02:46:05.014217 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:05.014191 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/525e6e89-8bf5-472a-bde7-bfb1254515af-hosts-file\") pod \"node-resolver-qvkw2\" (UID: \"525e6e89-8bf5-472a-bde7-bfb1254515af\") " pod="openshift-dns/node-resolver-qvkw2"
Apr 18 02:46:05.015045 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:05.015017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/525e6e89-8bf5-472a-bde7-bfb1254515af-tmp-dir\") pod \"node-resolver-qvkw2\" (UID: \"525e6e89-8bf5-472a-bde7-bfb1254515af\") " pod="openshift-dns/node-resolver-qvkw2"
Apr 18 02:46:05.022688 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:05.022642 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kqr9\" (UniqueName: \"kubernetes.io/projected/525e6e89-8bf5-472a-bde7-bfb1254515af-kube-api-access-6kqr9\") pod \"node-resolver-qvkw2\" (UID: \"525e6e89-8bf5-472a-bde7-bfb1254515af\") " pod="openshift-dns/node-resolver-qvkw2"
Apr 18 02:46:05.203237 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:05.203203 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qvkw2"
Apr 18 02:46:05.358259 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:05.358057 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:05.358259 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:05.358190 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a"
Apr 18 02:46:05.358444 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:05.358271 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:05.358444 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:05.358372 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38"
Apr 18 02:46:06.358358 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:06.358326 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:06.358828 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:06.358471 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0"
Apr 18 02:46:06.728394 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:06.727750 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:06.728394 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:06.727911 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 18 02:46:06.728394 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:06.727974 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret podName:b37c426b-d777-4a11-9630-cc3f589672b0 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:10.727955098 +0000 UTC m=+13.867806415 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret") pod "global-pull-secret-syncer-7fmtp" (UID: "b37c426b-d777-4a11-9630-cc3f589672b0") : object "kube-system"/"original-pull-secret" not registered
Apr 18 02:46:07.030808 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:07.030678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:07.030997 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:07.030866 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:07.030997 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:07.030934 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs podName:64806518-b360-4104-92e5-8a3017ab382a nodeName:}" failed. No retries permitted until 2026-04-18 02:46:15.030914953 +0000 UTC m=+18.170766268 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs") pod "network-metrics-daemon-6xc88" (UID: "64806518-b360-4104-92e5-8a3017ab382a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:07.132480 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:07.131894 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwml\" (UniqueName: \"kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml\") pod \"network-check-target-94m6z\" (UID: \"0ec2b026-c157-4159-99e6-03c6327b4c38\") " pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:07.132480 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:07.132052 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 18 02:46:07.132480 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:07.132069 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 18 02:46:07.132480 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:07.132083 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9rwml for pod openshift-network-diagnostics/network-check-target-94m6z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:07.132480 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:07.132133 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml podName:0ec2b026-c157-4159-99e6-03c6327b4c38 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:15.132115778 +0000 UTC m=+18.271967098 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9rwml" (UniqueName: "kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml") pod "network-check-target-94m6z" (UID: "0ec2b026-c157-4159-99e6-03c6327b4c38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:07.359378 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:07.358943 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:07.359378 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:07.359055 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38"
Apr 18 02:46:07.359378 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:07.359107 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:07.359378 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:07.359183 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a"
Apr 18 02:46:08.358622 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:08.358591 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:08.358771 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:08.358715 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0"
Apr 18 02:46:09.358233 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:09.358196 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:09.358669 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:09.358316 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38"
Apr 18 02:46:09.358669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:09.358368 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:09.358669 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:09.358491 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a"
Apr 18 02:46:10.357985 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:10.357955 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:10.358167 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:10.358050 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0"
Apr 18 02:46:10.765192 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:10.765160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:10.765595 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:10.765331 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 18 02:46:10.765595 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:10.765420 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret podName:b37c426b-d777-4a11-9630-cc3f589672b0 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:18.76540167 +0000 UTC m=+21.905252996 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret") pod "global-pull-secret-syncer-7fmtp" (UID: "b37c426b-d777-4a11-9630-cc3f589672b0") : object "kube-system"/"original-pull-secret" not registered
Apr 18 02:46:11.358265 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:11.358235 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:11.358453 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:11.358235 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:11.358453 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:11.358344 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a"
Apr 18 02:46:11.358571 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:11.358459 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38"
Apr 18 02:46:12.357830 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:12.357742 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:12.358246 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:12.357861 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0"
Apr 18 02:46:13.357892 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:13.357860 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:13.358354 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:13.358007 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a"
Apr 18 02:46:13.358354 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:13.358070 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:13.358354 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:13.358176 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38"
Apr 18 02:46:14.358408 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:14.358372 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:14.358809 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:14.358496 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0" Apr 18 02:46:15.101239 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:15.101201 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:46:15.101410 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:15.101393 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:46:15.101479 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:15.101467 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs podName:64806518-b360-4104-92e5-8a3017ab382a nodeName:}" failed. No retries permitted until 2026-04-18 02:46:31.101444608 +0000 UTC m=+34.241295928 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs") pod "network-metrics-daemon-6xc88" (UID: "64806518-b360-4104-92e5-8a3017ab382a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:46:15.201793 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:15.201758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwml\" (UniqueName: \"kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml\") pod \"network-check-target-94m6z\" (UID: \"0ec2b026-c157-4159-99e6-03c6327b4c38\") " pod="openshift-network-diagnostics/network-check-target-94m6z" Apr 18 02:46:15.201958 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:15.201876 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 18 02:46:15.201958 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:15.201894 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 18 02:46:15.201958 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:15.201908 2577 projected.go:194] Error preparing data for projected volume kube-api-access-9rwml for pod openshift-network-diagnostics/network-check-target-94m6z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:46:15.202089 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:15.201963 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml podName:0ec2b026-c157-4159-99e6-03c6327b4c38 nodeName:}" failed. 
No retries permitted until 2026-04-18 02:46:31.201944869 +0000 UTC m=+34.341796183 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9rwml" (UniqueName: "kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml") pod "network-check-target-94m6z" (UID: "0ec2b026-c157-4159-99e6-03c6327b4c38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:46:15.358464 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:15.358392 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:46:15.358888 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:15.358395 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z" Apr 18 02:46:15.358888 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:15.358508 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a" Apr 18 02:46:15.358888 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:15.358610 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38" Apr 18 02:46:16.357970 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:16.357940 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp" Apr 18 02:46:16.358175 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:16.358032 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0" Apr 18 02:46:16.707540 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:46:16.707503 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod525e6e89_8bf5_472a_bde7_bfb1254515af.slice/crio-d933204c521fe8c1266e8bc312243832ee6ae79ec3dd6a8133015fea78adfe6a WatchSource:0}: Error finding container d933204c521fe8c1266e8bc312243832ee6ae79ec3dd6a8133015fea78adfe6a: Status 404 returned error can't find the container with id d933204c521fe8c1266e8bc312243832ee6ae79ec3dd6a8133015fea78adfe6a Apr 18 02:46:17.359455 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.359175 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z" Apr 18 02:46:17.359578 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:17.359515 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38" Apr 18 02:46:17.359578 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.359256 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:46:17.359642 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:17.359601 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a" Apr 18 02:46:17.452484 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.452456 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" event={"ID":"10696293-8e1c-431c-8a24-8d0bfab036d1","Type":"ContainerStarted","Data":"a84bdbd560005f4a604ad59b219d5aa96c4695519ef2f9fee79203b6873b1e64"} Apr 18 02:46:17.453892 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.453864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" event={"ID":"769626a8-75a8-4f3b-82c6-400dd0e7d7cd","Type":"ContainerStarted","Data":"b18833cda3e69d904e809c67a1b7d47ef488cc5553b205396eda6e6ce97d75f9"} Apr 18 02:46:17.455288 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.455266 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9ffdc" event={"ID":"1f8ead79-25ed-4501-ab2c-99de1d600ce7","Type":"ContainerStarted","Data":"5c3c019731d1486f8a7a34268070f0d2f5a0ea9487799e1eea8d0cd3809974af"} Apr 18 02:46:17.456599 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.456579 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qvkw2" 
event={"ID":"525e6e89-8bf5-472a-bde7-bfb1254515af","Type":"ContainerStarted","Data":"af51f02191c5d6db32ab1021b2e63c9e3acbfbb8dd2ba8bcc68364a26549edc3"} Apr 18 02:46:17.456722 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.456703 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qvkw2" event={"ID":"525e6e89-8bf5-472a-bde7-bfb1254515af","Type":"ContainerStarted","Data":"d933204c521fe8c1266e8bc312243832ee6ae79ec3dd6a8133015fea78adfe6a"} Apr 18 02:46:17.458070 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.458049 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cn84g" event={"ID":"d48f7502-3de3-4ca9-92d5-5eaf5e999c97","Type":"ContainerStarted","Data":"fafe6a2af1b5cfa39c8651352be15f43358492f8dda062b9256b486a32dd5b52"} Apr 18 02:46:17.460072 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.460051 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 02:46:17.460439 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.460414 2577 generic.go:358] "Generic (PLEG): container finished" podID="20075b24-809d-40f9-8a39-d31291dbdc96" containerID="6c2ceab044a2c4058e13a0912bfe76cfee962a688603eab2665640945b086e31" exitCode=1 Apr 18 02:46:17.460538 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.460487 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" event={"ID":"20075b24-809d-40f9-8a39-d31291dbdc96","Type":"ContainerStarted","Data":"4a053eff17fc8e111c34be21c4cdc693aeef71e2ad07629def7d5f6d9c5dce79"} Apr 18 02:46:17.460538 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.460515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" 
event={"ID":"20075b24-809d-40f9-8a39-d31291dbdc96","Type":"ContainerDied","Data":"6c2ceab044a2c4058e13a0912bfe76cfee962a688603eab2665640945b086e31"} Apr 18 02:46:17.460538 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.460530 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" event={"ID":"20075b24-809d-40f9-8a39-d31291dbdc96","Type":"ContainerStarted","Data":"93ceb4c226afa84c88851e3c3dbbe3d8c423969f698542dbfa1a73e6a4cb1839"} Apr 18 02:46:17.461839 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.461809 2577 generic.go:358] "Generic (PLEG): container finished" podID="200381f5-de50-4d9c-ba7d-aac4abdd4c3d" containerID="89242bc4c151e58e9c20c41e65d98d0b19dbfbc4eb70b42a108a1bb6c76a6d2f" exitCode=0 Apr 18 02:46:17.462165 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.462142 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dd75m" event={"ID":"200381f5-de50-4d9c-ba7d-aac4abdd4c3d","Type":"ContainerDied","Data":"89242bc4c151e58e9c20c41e65d98d0b19dbfbc4eb70b42a108a1bb6c76a6d2f"} Apr 18 02:46:17.463622 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.463592 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4x4ql" event={"ID":"1cbb8019-14fc-48b5-b072-319e2f45207e","Type":"ContainerStarted","Data":"b9a90646ab14bf9795d43094f97d60bdc74e6a49aa653a3d5f749c04dbcbdf75"} Apr 18 02:46:17.470610 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.470504 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hrnvq" podStartSLOduration=3.695848688 podStartE2EDuration="20.47049116s" podCreationTimestamp="2026-04-18 02:45:57 +0000 UTC" firstStartedPulling="2026-04-18 02:45:59.98382097 +0000 UTC m=+3.123672298" lastFinishedPulling="2026-04-18 02:46:16.758463442 +0000 UTC m=+19.898314770" observedRunningTime="2026-04-18 02:46:17.469933034 +0000 UTC m=+20.609784370" 
watchObservedRunningTime="2026-04-18 02:46:17.47049116 +0000 UTC m=+20.610342497" Apr 18 02:46:17.485311 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.485269 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9ffdc" podStartSLOduration=3.688724423 podStartE2EDuration="20.48525494s" podCreationTimestamp="2026-04-18 02:45:57 +0000 UTC" firstStartedPulling="2026-04-18 02:45:59.978187407 +0000 UTC m=+3.118038722" lastFinishedPulling="2026-04-18 02:46:16.774717913 +0000 UTC m=+19.914569239" observedRunningTime="2026-04-18 02:46:17.484847967 +0000 UTC m=+20.624699303" watchObservedRunningTime="2026-04-18 02:46:17.48525494 +0000 UTC m=+20.625106276" Apr 18 02:46:17.518779 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.518738 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4x4ql" podStartSLOduration=11.848030653 podStartE2EDuration="20.518722002s" podCreationTimestamp="2026-04-18 02:45:57 +0000 UTC" firstStartedPulling="2026-04-18 02:45:59.981666418 +0000 UTC m=+3.121517731" lastFinishedPulling="2026-04-18 02:46:08.652357755 +0000 UTC m=+11.792209080" observedRunningTime="2026-04-18 02:46:17.518124735 +0000 UTC m=+20.657976070" watchObservedRunningTime="2026-04-18 02:46:17.518722002 +0000 UTC m=+20.658573338" Apr 18 02:46:17.535828 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:17.535768 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cn84g" podStartSLOduration=3.821808493 podStartE2EDuration="20.535753345s" podCreationTimestamp="2026-04-18 02:45:57 +0000 UTC" firstStartedPulling="2026-04-18 02:45:59.991451427 +0000 UTC m=+3.131302740" lastFinishedPulling="2026-04-18 02:46:16.705396276 +0000 UTC m=+19.845247592" observedRunningTime="2026-04-18 02:46:17.535349456 +0000 UTC m=+20.675200788" watchObservedRunningTime="2026-04-18 02:46:17.535753345 +0000 UTC m=+20.675604679" Apr 18 
02:46:18.179591 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.179401 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 18 02:46:18.335171 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.335083 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-18T02:46:18.179586934Z","UUID":"23c0c9a3-8a1f-4b29-a249-602722f248f7","Handler":null,"Name":"","Endpoint":""} Apr 18 02:46:18.337747 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.337726 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 18 02:46:18.337882 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.337762 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 18 02:46:18.357678 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.357620 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp" Apr 18 02:46:18.357776 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:18.357731 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0" Apr 18 02:46:18.467785 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.467758 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 02:46:18.468147 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.468106 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" event={"ID":"20075b24-809d-40f9-8a39-d31291dbdc96","Type":"ContainerStarted","Data":"14305c3a6d29c33532b7674d2223ec2973071934ceac008490dc533b42c74865"} Apr 18 02:46:18.468147 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.468148 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" event={"ID":"20075b24-809d-40f9-8a39-d31291dbdc96","Type":"ContainerStarted","Data":"46f0f74f7a0556ba2ac195f9d30d126b9974e0ca203aad9c43cefc580d3b320a"} Apr 18 02:46:18.468333 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.468163 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" event={"ID":"20075b24-809d-40f9-8a39-d31291dbdc96","Type":"ContainerStarted","Data":"93636e8421cc427d1552d30c29d9a1baa17e35f369d16bfb6d36a0a0d7299044"} Apr 18 02:46:18.469522 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.469491 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zwm9t" event={"ID":"dcafdc12-cd15-48ec-90e4-eede66deb4e9","Type":"ContainerStarted","Data":"c83c2e47d51c69792a842250fc1ccf674da3b46827c1063ba59ef13b4b5ab6ad"} Apr 18 02:46:18.471406 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.471378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" 
event={"ID":"10696293-8e1c-431c-8a24-8d0bfab036d1","Type":"ContainerStarted","Data":"002fb2efabd68d9a127542a4e27fae4bbadf7df9cd80add9177ed07050c95864"} Apr 18 02:46:18.482416 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.482368 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zwm9t" podStartSLOduration=4.713758525 podStartE2EDuration="21.482351957s" podCreationTimestamp="2026-04-18 02:45:57 +0000 UTC" firstStartedPulling="2026-04-18 02:45:59.98987111 +0000 UTC m=+3.129722425" lastFinishedPulling="2026-04-18 02:46:16.75846453 +0000 UTC m=+19.898315857" observedRunningTime="2026-04-18 02:46:18.482244497 +0000 UTC m=+21.622095835" watchObservedRunningTime="2026-04-18 02:46:18.482351957 +0000 UTC m=+21.622203293" Apr 18 02:46:18.482529 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.482466 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qvkw2" podStartSLOduration=14.482458819 podStartE2EDuration="14.482458819s" podCreationTimestamp="2026-04-18 02:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:46:17.548798883 +0000 UTC m=+20.688650222" watchObservedRunningTime="2026-04-18 02:46:18.482458819 +0000 UTC m=+21.622310155" Apr 18 02:46:18.830811 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:18.830778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp" Apr 18 02:46:18.830988 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:18.830947 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object 
"kube-system"/"original-pull-secret" not registered Apr 18 02:46:18.831055 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:18.831021 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret podName:b37c426b-d777-4a11-9630-cc3f589672b0 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:34.831000196 +0000 UTC m=+37.970851515 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret") pod "global-pull-secret-syncer-7fmtp" (UID: "b37c426b-d777-4a11-9630-cc3f589672b0") : object "kube-system"/"original-pull-secret" not registered Apr 18 02:46:19.357985 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:19.357939 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:46:19.358654 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:19.357939 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z" Apr 18 02:46:19.358654 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:19.358069 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a" Apr 18 02:46:19.358654 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:19.358181 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38" Apr 18 02:46:20.357694 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:20.357664 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp" Apr 18 02:46:20.357884 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:20.357780 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0" Apr 18 02:46:20.478061 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:20.478025 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 02:46:20.478697 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:20.478439 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" event={"ID":"20075b24-809d-40f9-8a39-d31291dbdc96","Type":"ContainerStarted","Data":"f66512bdcc63e8fe2abf1338fb6311ba06a489d9d3215d5b84d03485ba0ecd64"} Apr 18 02:46:20.480246 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:20.480223 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" event={"ID":"10696293-8e1c-431c-8a24-8d0bfab036d1","Type":"ContainerStarted","Data":"7678a450599140d59b92d59ec7bc01f4ee2fd470048aef36c91f001a54ceb2cf"} Apr 18 02:46:20.498325 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:20.498277 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vfdxm" podStartSLOduration=4.116371954 podStartE2EDuration="23.498265664s" podCreationTimestamp="2026-04-18 02:45:57 +0000 UTC" firstStartedPulling="2026-04-18 02:45:59.984626123 +0000 UTC m=+3.124477439" lastFinishedPulling="2026-04-18 02:46:19.366519825 +0000 UTC m=+22.506371149" observedRunningTime="2026-04-18 02:46:20.498100448 +0000 UTC m=+23.637951783" watchObservedRunningTime="2026-04-18 02:46:20.498265664 +0000 UTC m=+23.638116997" Apr 18 02:46:21.358220 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:21.358187 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:21.358442 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:21.358321 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38"
Apr 18 02:46:21.358442 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:21.358378 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:21.358576 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:21.358497 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a"
Apr 18 02:46:21.594936 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:21.594906 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4x4ql"
Apr 18 02:46:21.595970 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:21.595947 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4x4ql"
Apr 18 02:46:22.358377 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:22.358166 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:22.358517 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:22.358394 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0"
Apr 18 02:46:22.485968 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:22.485945 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log"
Apr 18 02:46:22.486324 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:22.486301 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" event={"ID":"20075b24-809d-40f9-8a39-d31291dbdc96","Type":"ContainerStarted","Data":"8a73005b0b5a3003fc8a752228fdde93e674fd88cd7a9f22aab03037a571b0e8"}
Apr 18 02:46:22.486580 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:22.486545 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:46:22.486580 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:22.486585 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:46:22.486760 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:22.486746 2577 scope.go:117] "RemoveContainer" containerID="6c2ceab044a2c4058e13a0912bfe76cfee962a688603eab2665640945b086e31"
Apr 18 02:46:22.488061 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:22.488037 2577 generic.go:358] "Generic (PLEG): container finished" podID="200381f5-de50-4d9c-ba7d-aac4abdd4c3d" containerID="88a2787b262bf9cc8cf1c53a7aa6eddc5a4bd353109a88b07c48a8fda0077393" exitCode=0
Apr 18 02:46:22.488146 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:22.488107 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dd75m" event={"ID":"200381f5-de50-4d9c-ba7d-aac4abdd4c3d","Type":"ContainerDied","Data":"88a2787b262bf9cc8cf1c53a7aa6eddc5a4bd353109a88b07c48a8fda0077393"}
Apr 18 02:46:22.488254 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:22.488236 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4x4ql"
Apr 18 02:46:22.488728 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:22.488713 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4x4ql"
Apr 18 02:46:22.501720 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:22.501704 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:46:23.358458 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:23.358423 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:23.358922 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:23.358423 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:23.358922 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:23.358591 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a"
Apr 18 02:46:23.358922 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:23.358621 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38"
Apr 18 02:46:23.492671 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:23.492644 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log"
Apr 18 02:46:23.492962 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:23.492936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" event={"ID":"20075b24-809d-40f9-8a39-d31291dbdc96","Type":"ContainerStarted","Data":"b19b8937c626da1c5c63c856c5b4424d72ccfac68b635191b5fc6e3983ab4b42"}
Apr 18 02:46:23.493266 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:23.493238 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:46:23.506362 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:23.506343 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:46:23.520119 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:23.520082 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2" podStartSLOduration=9.693359707 podStartE2EDuration="26.52007118s" podCreationTimestamp="2026-04-18 02:45:57 +0000 UTC" firstStartedPulling="2026-04-18 02:45:59.989668188 +0000 UTC m=+3.129519513" lastFinishedPulling="2026-04-18 02:46:16.816379673 +0000 UTC m=+19.956230986" observedRunningTime="2026-04-18 02:46:23.519746333 +0000 UTC m=+26.659597667" watchObservedRunningTime="2026-04-18 02:46:23.52007118 +0000 UTC m=+26.659922511"
Apr 18 02:46:24.110957 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:24.110721 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7fmtp"]
Apr 18 02:46:24.111096 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:24.111056 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:24.111155 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:24.111134 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0"
Apr 18 02:46:24.113940 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:24.113915 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6xc88"]
Apr 18 02:46:24.114075 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:24.114025 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:24.114139 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:24.114119 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a"
Apr 18 02:46:24.114523 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:24.114504 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-94m6z"]
Apr 18 02:46:24.114634 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:24.114622 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:24.114733 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:24.114715 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38"
Apr 18 02:46:24.496695 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:24.496661 2577 generic.go:358] "Generic (PLEG): container finished" podID="200381f5-de50-4d9c-ba7d-aac4abdd4c3d" containerID="e3b281fa8703ba7cd63961f90bb7ec08a387846739152592c58172cc2416514e" exitCode=0
Apr 18 02:46:24.497148 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:24.496735 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dd75m" event={"ID":"200381f5-de50-4d9c-ba7d-aac4abdd4c3d","Type":"ContainerDied","Data":"e3b281fa8703ba7cd63961f90bb7ec08a387846739152592c58172cc2416514e"}
Apr 18 02:46:25.358393 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:25.358364 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:25.358571 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:25.358475 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0"
Apr 18 02:46:26.357877 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:26.357848 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:26.358330 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:26.357851 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:26.358330 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:26.357936 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38"
Apr 18 02:46:26.358330 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:26.358035 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a"
Apr 18 02:46:26.502741 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:26.502703 2577 generic.go:358] "Generic (PLEG): container finished" podID="200381f5-de50-4d9c-ba7d-aac4abdd4c3d" containerID="599d8a39cb9c890810b95a6b6c14e72172f087e8694ecf00d19fba7b2fed2855" exitCode=0
Apr 18 02:46:26.502880 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:26.502749 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dd75m" event={"ID":"200381f5-de50-4d9c-ba7d-aac4abdd4c3d","Type":"ContainerDied","Data":"599d8a39cb9c890810b95a6b6c14e72172f087e8694ecf00d19fba7b2fed2855"}
Apr 18 02:46:27.360798 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:27.360768 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:27.361411 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:27.360870 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0"
Apr 18 02:46:28.357907 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:28.357870 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:46:28.358054 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:28.357870 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:46:28.358054 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:28.358000 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-94m6z" podUID="0ec2b026-c157-4159-99e6-03c6327b4c38"
Apr 18 02:46:28.358174 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:28.358058 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a"
Apr 18 02:46:29.358444 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.358418 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp"
Apr 18 02:46:29.358805 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:29.358529 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7fmtp" podUID="b37c426b-d777-4a11-9630-cc3f589672b0"
Apr 18 02:46:29.703863 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.703766 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-79.ec2.internal" event="NodeReady"
Apr 18 02:46:29.704078 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.703935 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 18 02:46:29.737156 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.737126 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7f9c87579f-2s7d5"]
Apr 18 02:46:29.758099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.758075 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9l4pp"]
Apr 18 02:46:29.758279 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.758258 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:29.760765 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.760736 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 18 02:46:29.760959 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.760942 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 18 02:46:29.761347 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.761327 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 18 02:46:29.761573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.761528 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jxh9b\""
Apr 18 02:46:29.766618 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.766597 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 18 02:46:29.772622 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.772600 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ms6tq"]
Apr 18 02:46:29.772758 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.772742 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:29.775068 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.775048 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 18 02:46:29.775156 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.775057 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-djfrv\""
Apr 18 02:46:29.775156 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.775095 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 18 02:46:29.807801 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.807781 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7f9c87579f-2s7d5"]
Apr 18 02:46:29.807801 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.807803 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9l4pp"]
Apr 18 02:46:29.807912 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.807812 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ms6tq"]
Apr 18 02:46:29.807912 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.807894 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ms6tq"
Apr 18 02:46:29.810285 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.810234 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 18 02:46:29.810433 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.810306 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 18 02:46:29.810524 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.810509 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 18 02:46:29.810705 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.810571 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dwvkt\""
Apr 18 02:46:29.917530 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.917494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5vlw\" (UniqueName: \"kubernetes.io/projected/87aa1c86-8143-4fdc-b899-17184a387dcf-kube-api-access-t5vlw\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:29.917692 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.917588 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-certificates\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:29.917692 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.917620 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj9h4\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-kube-api-access-nj9h4\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:29.917692 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.917654 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-image-registry-private-configuration\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:29.917692 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.917678 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-installation-pull-secrets\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:29.917893 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.917703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87aa1c86-8143-4fdc-b899-17184a387dcf-config-volume\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:29.917893 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.917793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:29.917893 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.917854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-ca-trust-extracted\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:29.917893 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.917887 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-trusted-ca\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:29.918026 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.917914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:29.918026 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.917955 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-bound-sa-token\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:29.918026 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.917981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt9c4\" (UniqueName: \"kubernetes.io/projected/17f79784-a585-4ce4-ae11-e420d136c2d0-kube-api-access-wt9c4\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq"
Apr 18 02:46:29.918026 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.918015 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87aa1c86-8143-4fdc-b899-17184a387dcf-tmp-dir\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:29.918221 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:29.918063 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq"
Apr 18 02:46:30.018852 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.018805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-certificates\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.018852 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.018846 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nj9h4\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-kube-api-access-nj9h4\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.019092 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.018873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-image-registry-private-configuration\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.019092 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.018891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-installation-pull-secrets\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.019092 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.018910 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87aa1c86-8143-4fdc-b899-17184a387dcf-config-volume\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:30.019092 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.018949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.019092 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.018986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-ca-trust-extracted\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.019092 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.019013 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-trusted-ca\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.019092 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.019035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:30.019092 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.019057 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 18 02:46:30.019092 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.019081 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f9c87579f-2s7d5: secret "image-registry-tls" not found
Apr 18 02:46:30.019451 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.019150 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls podName:f5f88955-bfd0-4343-b6c6-2a18dd0d1149 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:30.519129446 +0000 UTC m=+33.658980763 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls") pod "image-registry-7f9c87579f-2s7d5" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149") : secret "image-registry-tls" not found
Apr 18 02:46:30.019451 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.019062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-bound-sa-token\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.019451 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.019268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wt9c4\" (UniqueName: \"kubernetes.io/projected/17f79784-a585-4ce4-ae11-e420d136c2d0-kube-api-access-wt9c4\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq"
Apr 18 02:46:30.019451 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.019294 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87aa1c86-8143-4fdc-b899-17184a387dcf-tmp-dir\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:30.019451 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.019330 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq"
Apr 18 02:46:30.019451 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.019358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5vlw\" (UniqueName: \"kubernetes.io/projected/87aa1c86-8143-4fdc-b899-17184a387dcf-kube-api-access-t5vlw\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:30.019687 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.019473 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:46:30.019687 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.019524 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls podName:87aa1c86-8143-4fdc-b899-17184a387dcf nodeName:}" failed. No retries permitted until 2026-04-18 02:46:30.519505994 +0000 UTC m=+33.659357322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls") pod "dns-default-9l4pp" (UID: "87aa1c86-8143-4fdc-b899-17184a387dcf") : secret "dns-default-metrics-tls" not found
Apr 18 02:46:30.019687 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.019602 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-certificates\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.019687 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.019656 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87aa1c86-8143-4fdc-b899-17184a387dcf-tmp-dir\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:30.019687 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.019659 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87aa1c86-8143-4fdc-b899-17184a387dcf-config-volume\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:30.019687 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.019679 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:46:30.019942 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.019745 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert podName:17f79784-a585-4ce4-ae11-e420d136c2d0 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:30.519729574 +0000 UTC m=+33.659580896 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert") pod "ingress-canary-ms6tq" (UID: "17f79784-a585-4ce4-ae11-e420d136c2d0") : secret "canary-serving-cert" not found
Apr 18 02:46:30.019942 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.019931 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-ca-trust-extracted\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.023833 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.023810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-installation-pull-secrets\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.023968 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.023815 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-image-registry-private-configuration\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.027400 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.027375 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5vlw\" (UniqueName: \"kubernetes.io/projected/87aa1c86-8143-4fdc-b899-17184a387dcf-kube-api-access-t5vlw\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:30.027688 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.027653 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-bound-sa-token\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.027803 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.027781 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj9h4\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-kube-api-access-nj9h4\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:30.028115 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.028099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt9c4\" (UniqueName:
\"kubernetes.io/projected/17f79784-a585-4ce4-ae11-e420d136c2d0-kube-api-access-wt9c4\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq" Apr 18 02:46:30.030530 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.030506 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-trusted-ca\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:46:30.358305 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.358228 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z" Apr 18 02:46:30.358454 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.358240 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:46:30.360821 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.360796 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 18 02:46:30.361222 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.360936 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h2hff\"" Apr 18 02:46:30.361222 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.360940 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 18 02:46:30.361222 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.360974 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 18 02:46:30.361222 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:46:30.361023 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5nn98\"" Apr 18 02:46:30.521991 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.521959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:46:30.522175 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.522017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp" Apr 18 02:46:30.522175 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:30.522055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq" Apr 18 02:46:30.522175 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.522163 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 18 02:46:30.522314 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.522194 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 18 02:46:30.522314 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.522243 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls 
podName:87aa1c86-8143-4fdc-b899-17184a387dcf nodeName:}" failed. No retries permitted until 2026-04-18 02:46:31.522223123 +0000 UTC m=+34.662074435 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls") pod "dns-default-9l4pp" (UID: "87aa1c86-8143-4fdc-b899-17184a387dcf") : secret "dns-default-metrics-tls" not found Apr 18 02:46:30.522314 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.522165 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 18 02:46:30.522314 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.522268 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert podName:17f79784-a585-4ce4-ae11-e420d136c2d0 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:31.522252092 +0000 UTC m=+34.662103409 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert") pod "ingress-canary-ms6tq" (UID: "17f79784-a585-4ce4-ae11-e420d136c2d0") : secret "canary-serving-cert" not found Apr 18 02:46:30.522314 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.522275 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f9c87579f-2s7d5: secret "image-registry-tls" not found Apr 18 02:46:30.522520 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:30.522334 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls podName:f5f88955-bfd0-4343-b6c6-2a18dd0d1149 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:31.522321274 +0000 UTC m=+34.662172601 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls") pod "image-registry-7f9c87579f-2s7d5" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149") : secret "image-registry-tls" not found Apr 18 02:46:31.126836 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:31.126803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:46:31.127024 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:31.126908 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 18 02:46:31.127024 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:31.126970 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs podName:64806518-b360-4104-92e5-8a3017ab382a nodeName:}" failed. No retries permitted until 2026-04-18 02:47:03.126951552 +0000 UTC m=+66.266802867 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs") pod "network-metrics-daemon-6xc88" (UID: "64806518-b360-4104-92e5-8a3017ab382a") : secret "metrics-daemon-secret" not found Apr 18 02:46:31.227787 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:31.227749 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwml\" (UniqueName: \"kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml\") pod \"network-check-target-94m6z\" (UID: \"0ec2b026-c157-4159-99e6-03c6327b4c38\") " pod="openshift-network-diagnostics/network-check-target-94m6z" Apr 18 02:46:31.230907 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:31.230880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rwml\" (UniqueName: \"kubernetes.io/projected/0ec2b026-c157-4159-99e6-03c6327b4c38-kube-api-access-9rwml\") pod \"network-check-target-94m6z\" (UID: \"0ec2b026-c157-4159-99e6-03c6327b4c38\") " pod="openshift-network-diagnostics/network-check-target-94m6z" Apr 18 02:46:31.269854 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:31.269828 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-94m6z" Apr 18 02:46:31.358385 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:31.358355 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp" Apr 18 02:46:31.361138 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:31.361113 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 18 02:46:31.529698 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:31.529670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:46:31.529886 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:31.529803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp" Apr 18 02:46:31.529886 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:31.529817 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 18 02:46:31.529886 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:31.529839 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f9c87579f-2s7d5: secret "image-registry-tls" not found Apr 18 02:46:31.529886 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:31.529843 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq" Apr 18 02:46:31.530087 ip-10-0-128-79 kubenswrapper[2577]: E0418 
02:46:31.529912 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls podName:f5f88955-bfd0-4343-b6c6-2a18dd0d1149 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:33.529889343 +0000 UTC m=+36.669740664 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls") pod "image-registry-7f9c87579f-2s7d5" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149") : secret "image-registry-tls" not found Apr 18 02:46:31.530087 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:31.529938 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 18 02:46:31.530087 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:31.529982 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls podName:87aa1c86-8143-4fdc-b899-17184a387dcf nodeName:}" failed. No retries permitted until 2026-04-18 02:46:33.529970945 +0000 UTC m=+36.669822274 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls") pod "dns-default-9l4pp" (UID: "87aa1c86-8143-4fdc-b899-17184a387dcf") : secret "dns-default-metrics-tls" not found Apr 18 02:46:31.530087 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:31.529987 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 18 02:46:31.530087 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:31.530045 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert podName:17f79784-a585-4ce4-ae11-e420d136c2d0 nodeName:}" failed. 
No retries permitted until 2026-04-18 02:46:33.530026807 +0000 UTC m=+36.669878134 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert") pod "ingress-canary-ms6tq" (UID: "17f79784-a585-4ce4-ae11-e420d136c2d0") : secret "canary-serving-cert" not found Apr 18 02:46:32.147483 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:32.147437 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-94m6z"] Apr 18 02:46:32.244569 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:46:32.244491 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ec2b026_c157_4159_99e6_03c6327b4c38.slice/crio-36847d57d6e1733285f66fa8fe8d4d19a5e90741cca2d204dc7b22bf64564fbe WatchSource:0}: Error finding container 36847d57d6e1733285f66fa8fe8d4d19a5e90741cca2d204dc7b22bf64564fbe: Status 404 returned error can't find the container with id 36847d57d6e1733285f66fa8fe8d4d19a5e90741cca2d204dc7b22bf64564fbe Apr 18 02:46:32.518637 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:32.518330 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dd75m" event={"ID":"200381f5-de50-4d9c-ba7d-aac4abdd4c3d","Type":"ContainerStarted","Data":"83c28efb5291d2a0ef098c3174d901f6840746fc06fdf48757a63b22ce5275c7"} Apr 18 02:46:32.519465 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:32.519438 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-94m6z" event={"ID":"0ec2b026-c157-4159-99e6-03c6327b4c38","Type":"ContainerStarted","Data":"36847d57d6e1733285f66fa8fe8d4d19a5e90741cca2d204dc7b22bf64564fbe"} Apr 18 02:46:33.523728 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:33.523693 2577 generic.go:358] "Generic (PLEG): container finished" podID="200381f5-de50-4d9c-ba7d-aac4abdd4c3d" 
containerID="83c28efb5291d2a0ef098c3174d901f6840746fc06fdf48757a63b22ce5275c7" exitCode=0 Apr 18 02:46:33.524100 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:33.523739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dd75m" event={"ID":"200381f5-de50-4d9c-ba7d-aac4abdd4c3d","Type":"ContainerDied","Data":"83c28efb5291d2a0ef098c3174d901f6840746fc06fdf48757a63b22ce5275c7"} Apr 18 02:46:33.546179 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:33.546152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq" Apr 18 02:46:33.546274 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:33.546232 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:46:33.546337 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:33.546273 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp" Apr 18 02:46:33.546337 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:33.546304 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 18 02:46:33.546435 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:33.546361 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 18 02:46:33.546435 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:33.546367 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert podName:17f79784-a585-4ce4-ae11-e420d136c2d0 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:37.546348595 +0000 UTC m=+40.686199926 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert") pod "ingress-canary-ms6tq" (UID: "17f79784-a585-4ce4-ae11-e420d136c2d0") : secret "canary-serving-cert" not found Apr 18 02:46:33.546435 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:33.546392 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls podName:87aa1c86-8143-4fdc-b899-17184a387dcf nodeName:}" failed. No retries permitted until 2026-04-18 02:46:37.546383067 +0000 UTC m=+40.686234383 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls") pod "dns-default-9l4pp" (UID: "87aa1c86-8143-4fdc-b899-17184a387dcf") : secret "dns-default-metrics-tls" not found Apr 18 02:46:33.546435 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:33.546396 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 18 02:46:33.546435 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:33.546415 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f9c87579f-2s7d5: secret "image-registry-tls" not found Apr 18 02:46:33.546663 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:33.546461 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls podName:f5f88955-bfd0-4343-b6c6-2a18dd0d1149 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:37.546446951 +0000 UTC m=+40.686298267 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls") pod "image-registry-7f9c87579f-2s7d5" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149") : secret "image-registry-tls" not found Apr 18 02:46:34.528933 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:34.528889 2577 generic.go:358] "Generic (PLEG): container finished" podID="200381f5-de50-4d9c-ba7d-aac4abdd4c3d" containerID="45cfaab866f5c2d0e0066977c800710c9c5d9b2f9f042b9e6c117a285265fb62" exitCode=0 Apr 18 02:46:34.529433 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:34.528945 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dd75m" event={"ID":"200381f5-de50-4d9c-ba7d-aac4abdd4c3d","Type":"ContainerDied","Data":"45cfaab866f5c2d0e0066977c800710c9c5d9b2f9f042b9e6c117a285265fb62"} Apr 18 02:46:34.857232 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:34.857130 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp" Apr 18 02:46:34.861420 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:34.861391 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b37c426b-d777-4a11-9630-cc3f589672b0-original-pull-secret\") pod \"global-pull-secret-syncer-7fmtp\" (UID: \"b37c426b-d777-4a11-9630-cc3f589672b0\") " pod="kube-system/global-pull-secret-syncer-7fmtp" Apr 18 02:46:34.968373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:34.968351 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7fmtp" Apr 18 02:46:35.090154 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:35.090019 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7fmtp"] Apr 18 02:46:35.093137 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:46:35.093108 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37c426b_d777_4a11_9630_cc3f589672b0.slice/crio-f17346a1008f3503d2f6f39574cdeea60a947f4f30968781a07a0884c2e39b64 WatchSource:0}: Error finding container f17346a1008f3503d2f6f39574cdeea60a947f4f30968781a07a0884c2e39b64: Status 404 returned error can't find the container with id f17346a1008f3503d2f6f39574cdeea60a947f4f30968781a07a0884c2e39b64 Apr 18 02:46:35.532066 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:35.532025 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7fmtp" event={"ID":"b37c426b-d777-4a11-9630-cc3f589672b0","Type":"ContainerStarted","Data":"f17346a1008f3503d2f6f39574cdeea60a947f4f30968781a07a0884c2e39b64"} Apr 18 02:46:35.535459 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:35.535429 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dd75m" event={"ID":"200381f5-de50-4d9c-ba7d-aac4abdd4c3d","Type":"ContainerStarted","Data":"cd706893574448ed72bb1caac1ac687a39ad63891717ed99d945d4b5bcf9243a"} Apr 18 02:46:35.537008 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:35.536987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-94m6z" event={"ID":"0ec2b026-c157-4159-99e6-03c6327b4c38","Type":"ContainerStarted","Data":"a1605d868b3274b45633fa991ae76105075cda864a3b8e31ab73c9b4979c6a60"} Apr 18 02:46:35.537110 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:35.537095 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-network-diagnostics/network-check-target-94m6z" Apr 18 02:46:35.557488 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:35.556867 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dd75m" podStartSLOduration=6.2557449609999995 podStartE2EDuration="38.556851242s" podCreationTimestamp="2026-04-18 02:45:57 +0000 UTC" firstStartedPulling="2026-04-18 02:45:59.987751808 +0000 UTC m=+3.127603129" lastFinishedPulling="2026-04-18 02:46:32.288858085 +0000 UTC m=+35.428709410" observedRunningTime="2026-04-18 02:46:35.554723895 +0000 UTC m=+38.694575230" watchObservedRunningTime="2026-04-18 02:46:35.556851242 +0000 UTC m=+38.696702578" Apr 18 02:46:35.568330 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:35.568294 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-94m6z" podStartSLOduration=35.653156962 podStartE2EDuration="38.56828289s" podCreationTimestamp="2026-04-18 02:45:57 +0000 UTC" firstStartedPulling="2026-04-18 02:46:32.264203454 +0000 UTC m=+35.404054779" lastFinishedPulling="2026-04-18 02:46:35.179329379 +0000 UTC m=+38.319180707" observedRunningTime="2026-04-18 02:46:35.567687913 +0000 UTC m=+38.707539248" watchObservedRunningTime="2026-04-18 02:46:35.56828289 +0000 UTC m=+38.708134258" Apr 18 02:46:37.575459 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:37.575411 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:46:37.575881 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:37.575494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp" Apr 18 02:46:37.575881 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:37.575529 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 18 02:46:37.575881 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:37.575562 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f9c87579f-2s7d5: secret "image-registry-tls" not found Apr 18 02:46:37.575881 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:37.575616 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls podName:f5f88955-bfd0-4343-b6c6-2a18dd0d1149 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:45.575599565 +0000 UTC m=+48.715450877 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls") pod "image-registry-7f9c87579f-2s7d5" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149") : secret "image-registry-tls" not found
Apr 18 02:46:37.575881 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:37.575632 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:46:37.575881 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:37.575532 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq"
Apr 18 02:46:37.575881 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:37.575653 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:46:37.575881 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:37.575694 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert podName:17f79784-a585-4ce4-ae11-e420d136c2d0 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:45.575675787 +0000 UTC m=+48.715527126 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert") pod "ingress-canary-ms6tq" (UID: "17f79784-a585-4ce4-ae11-e420d136c2d0") : secret "canary-serving-cert" not found
Apr 18 02:46:37.575881 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:37.575774 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls podName:87aa1c86-8143-4fdc-b899-17184a387dcf nodeName:}" failed. No retries permitted until 2026-04-18 02:46:45.575755744 +0000 UTC m=+48.715607057 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls") pod "dns-default-9l4pp" (UID: "87aa1c86-8143-4fdc-b899-17184a387dcf") : secret "dns-default-metrics-tls" not found
Apr 18 02:46:39.546844 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:39.546807 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7fmtp" event={"ID":"b37c426b-d777-4a11-9630-cc3f589672b0","Type":"ContainerStarted","Data":"7b4f1c34df766a838932c355a7c0b65fc2bbaa608c844337885029de6b70ec20"}
Apr 18 02:46:39.561035 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:39.560984 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7fmtp" podStartSLOduration=33.104210988 podStartE2EDuration="36.560969768s" podCreationTimestamp="2026-04-18 02:46:03 +0000 UTC" firstStartedPulling="2026-04-18 02:46:35.094969105 +0000 UTC m=+38.234820418" lastFinishedPulling="2026-04-18 02:46:38.551727872 +0000 UTC m=+41.691579198" observedRunningTime="2026-04-18 02:46:39.560381296 +0000 UTC m=+42.700232632" watchObservedRunningTime="2026-04-18 02:46:39.560969768 +0000 UTC m=+42.700821102"
Apr 18 02:46:45.622005 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:45.621969 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:46:45.622362 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:45.622016 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:46:45.622362 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:45.622043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq"
Apr 18 02:46:45.622362 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:45.622108 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 18 02:46:45.622362 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:45.622125 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f9c87579f-2s7d5: secret "image-registry-tls" not found
Apr 18 02:46:45.622362 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:45.622125 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:46:45.622362 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:45.622167 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:46:45.622362 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:45.622175 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert podName:17f79784-a585-4ce4-ae11-e420d136c2d0 nodeName:}" failed. No retries permitted until 2026-04-18 02:47:01.622160036 +0000 UTC m=+64.762011348 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert") pod "ingress-canary-ms6tq" (UID: "17f79784-a585-4ce4-ae11-e420d136c2d0") : secret "canary-serving-cert" not found
Apr 18 02:46:45.622362 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:45.622253 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls podName:f5f88955-bfd0-4343-b6c6-2a18dd0d1149 nodeName:}" failed. No retries permitted until 2026-04-18 02:47:01.622238565 +0000 UTC m=+64.762089883 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls") pod "image-registry-7f9c87579f-2s7d5" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149") : secret "image-registry-tls" not found
Apr 18 02:46:45.622362 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:46:45.622266 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls podName:87aa1c86-8143-4fdc-b899-17184a387dcf nodeName:}" failed. No retries permitted until 2026-04-18 02:47:01.62226002 +0000 UTC m=+64.762111332 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls") pod "dns-default-9l4pp" (UID: "87aa1c86-8143-4fdc-b899-17184a387dcf") : secret "dns-default-metrics-tls" not found
Apr 18 02:46:55.508966 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:46:55.508939 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cf2h2"
Apr 18 02:47:01.624643 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:47:01.624610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq"
Apr 18 02:47:01.625013 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:47:01.624668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:47:01.625013 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:47:01.624694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:47:01.625013 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:01.624753 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:47:01.625013 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:01.624776 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:47:01.625013 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:01.624790 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 18 02:47:01.625013 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:01.624802 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f9c87579f-2s7d5: secret "image-registry-tls" not found
Apr 18 02:47:01.625013 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:01.624812 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert podName:17f79784-a585-4ce4-ae11-e420d136c2d0 nodeName:}" failed. No retries permitted until 2026-04-18 02:47:33.624795384 +0000 UTC m=+96.764646697 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert") pod "ingress-canary-ms6tq" (UID: "17f79784-a585-4ce4-ae11-e420d136c2d0") : secret "canary-serving-cert" not found
Apr 18 02:47:01.625013 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:01.624826 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls podName:87aa1c86-8143-4fdc-b899-17184a387dcf nodeName:}" failed. No retries permitted until 2026-04-18 02:47:33.624819412 +0000 UTC m=+96.764670725 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls") pod "dns-default-9l4pp" (UID: "87aa1c86-8143-4fdc-b899-17184a387dcf") : secret "dns-default-metrics-tls" not found
Apr 18 02:47:01.625013 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:01.624836 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls podName:f5f88955-bfd0-4343-b6c6-2a18dd0d1149 nodeName:}" failed. No retries permitted until 2026-04-18 02:47:33.624831395 +0000 UTC m=+96.764682708 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls") pod "image-registry-7f9c87579f-2s7d5" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149") : secret "image-registry-tls" not found
Apr 18 02:47:03.134410 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:47:03.134364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:47:03.134817 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:03.134482 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 18 02:47:03.134817 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:03.134535 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs podName:64806518-b360-4104-92e5-8a3017ab382a nodeName:}" failed. No retries permitted until 2026-04-18 02:48:07.134520783 +0000 UTC m=+130.274372096 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs") pod "network-metrics-daemon-6xc88" (UID: "64806518-b360-4104-92e5-8a3017ab382a") : secret "metrics-daemon-secret" not found
Apr 18 02:47:06.541367 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:47:06.541329 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-94m6z"
Apr 18 02:47:33.642890 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:47:33.642745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5"
Apr 18 02:47:33.642890 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:47:33.642830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp"
Apr 18 02:47:33.642890 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:47:33.642854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq"
Apr 18 02:47:33.642890 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:33.642885 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 18 02:47:33.643422 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:33.642901 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f9c87579f-2s7d5: secret "image-registry-tls" not found
Apr 18 02:47:33.643422 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:33.642942 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:47:33.643422 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:33.642964 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:47:33.643422 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:33.642968 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls podName:f5f88955-bfd0-4343-b6c6-2a18dd0d1149 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:37.642951165 +0000 UTC m=+160.782802478 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls") pod "image-registry-7f9c87579f-2s7d5" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149") : secret "image-registry-tls" not found
Apr 18 02:47:33.643422 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:33.643020 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert podName:17f79784-a585-4ce4-ae11-e420d136c2d0 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:37.643007699 +0000 UTC m=+160.782859012 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert") pod "ingress-canary-ms6tq" (UID: "17f79784-a585-4ce4-ae11-e420d136c2d0") : secret "canary-serving-cert" not found
Apr 18 02:47:33.643422 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:47:33.643030 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls podName:87aa1c86-8143-4fdc-b899-17184a387dcf nodeName:}" failed. No retries permitted until 2026-04-18 02:48:37.643024869 +0000 UTC m=+160.782876182 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls") pod "dns-default-9l4pp" (UID: "87aa1c86-8143-4fdc-b899-17184a387dcf") : secret "dns-default-metrics-tls" not found
Apr 18 02:48:07.174203 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:07.174161 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88"
Apr 18 02:48:07.174705 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:07.174286 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 18 02:48:07.174705 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:07.174345 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs podName:64806518-b360-4104-92e5-8a3017ab382a nodeName:}" failed. No retries permitted until 2026-04-18 02:50:09.174327954 +0000 UTC m=+252.314179279 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs") pod "network-metrics-daemon-6xc88" (UID: "64806518-b360-4104-92e5-8a3017ab382a") : secret "metrics-daemon-secret" not found
Apr 18 02:48:13.080540 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.080501 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9"]
Apr 18 02:48:13.083504 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.083487 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-87df6b698-c7764"]
Apr 18 02:48:13.083671 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.083653 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9"
Apr 18 02:48:13.086253 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.086233 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.093148 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.093130 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 18 02:48:13.093637 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.093613 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 18 02:48:13.093734 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.093716 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 18 02:48:13.093816 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.093740 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 18 02:48:13.093816 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.093755 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 18 02:48:13.093933 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.093843 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 18 02:48:13.094035 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.094021 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 18 02:48:13.094153 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.094133 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 18 02:48:13.094500 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.094485 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 18 02:48:13.094812 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.094780 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 18 02:48:13.094943 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.094833 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-fwq2f\""
Apr 18 02:48:13.095246 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.095232 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-v2cqx\""
Apr 18 02:48:13.108402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.108382 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9"]
Apr 18 02:48:13.121532 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.121504 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-87df6b698-c7764"]
Apr 18 02:48:13.181036 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.181002 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-25mgk"]
Apr 18 02:48:13.183739 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.183722 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.186263 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.186244 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 18 02:48:13.186263 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.186250 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 18 02:48:13.186425 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.186283 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 18 02:48:13.186567 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.186538 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 18 02:48:13.186746 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.186732 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-wnlb6\""
Apr 18 02:48:13.190997 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.190976 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 18 02:48:13.192477 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.192459 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-25mgk"]
Apr 18 02:48:13.217489 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.217456 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdtlp\" (UniqueName: \"kubernetes.io/projected/d50fae41-d0e5-4ab2-8306-73f4036be5d1-kube-api-access-cdtlp\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9"
Apr 18 02:48:13.217669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.217512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-default-certificate\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.217669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.217596 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-stats-auth\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.217669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.217641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.217800 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.217689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d50fae41-d0e5-4ab2-8306-73f4036be5d1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9"
Apr 18 02:48:13.217800 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.217726 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9"
Apr 18 02:48:13.217800 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.217784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzpxp\" (UniqueName: \"kubernetes.io/projected/804c2955-592f-4663-b301-f7f6e6d14909-kube-api-access-zzpxp\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.217913 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.217809 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.319048 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzpxp\" (UniqueName: \"kubernetes.io/projected/804c2955-592f-4663-b301-f7f6e6d14909-kube-api-access-zzpxp\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.319166 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319054 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.319166 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319076 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7920e7d2-3418-454b-9269-8f13a0c96d2d-service-ca-bundle\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.319166 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319096 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7920e7d2-3418-454b-9269-8f13a0c96d2d-snapshots\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.319166 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319127 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7920e7d2-3418-454b-9269-8f13a0c96d2d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.319288 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:13.319164 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 18 02:48:13.319288 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d50fae41-d0e5-4ab2-8306-73f4036be5d1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9"
Apr 18 02:48:13.319288 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7920e7d2-3418-454b-9269-8f13a0c96d2d-serving-cert\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.319288 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:13.319257 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs podName:804c2955-592f-4663-b301-f7f6e6d14909 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:13.819233223 +0000 UTC m=+136.959084538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs") pod "router-default-87df6b698-c7764" (UID: "804c2955-592f-4663-b301-f7f6e6d14909") : secret "router-metrics-certs-default" not found
Apr 18 02:48:13.319471 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdtlp\" (UniqueName: \"kubernetes.io/projected/d50fae41-d0e5-4ab2-8306-73f4036be5d1-kube-api-access-cdtlp\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9"
Apr 18 02:48:13.319471 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgh67\" (UniqueName: \"kubernetes.io/projected/7920e7d2-3418-454b-9269-8f13a0c96d2d-kube-api-access-fgh67\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.319596 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319458 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-default-certificate\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.319596 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-stats-auth\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.319698 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7920e7d2-3418-454b-9269-8f13a0c96d2d-tmp\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.319698 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.319698 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9"
Apr 18 02:48:13.319832 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:13.319746 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 18 02:48:13.319832 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:13.319759 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle podName:804c2955-592f-4663-b301-f7f6e6d14909 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:13.819746698 +0000 UTC m=+136.959598014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle") pod "router-default-87df6b698-c7764" (UID: "804c2955-592f-4663-b301-f7f6e6d14909") : configmap references non-existent config key: service-ca.crt
Apr 18 02:48:13.319832 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:13.319809 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls podName:d50fae41-d0e5-4ab2-8306-73f4036be5d1 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:13.81979161 +0000 UTC m=+136.959643127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-98wv9" (UID: "d50fae41-d0e5-4ab2-8306-73f4036be5d1") : secret "cluster-monitoring-operator-tls" not found
Apr 18 02:48:13.320032 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.319927 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d50fae41-d0e5-4ab2-8306-73f4036be5d1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9"
Apr 18 02:48:13.322712 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.322686 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-stats-auth\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.322786 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.322749 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-default-certificate\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.328869 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.328849 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdtlp\" (UniqueName: \"kubernetes.io/projected/d50fae41-d0e5-4ab2-8306-73f4036be5d1-kube-api-access-cdtlp\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9"
Apr 18 02:48:13.328869 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.328862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzpxp\" (UniqueName: \"kubernetes.io/projected/804c2955-592f-4663-b301-f7f6e6d14909-kube-api-access-zzpxp\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764"
Apr 18 02:48:13.420684 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.420621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7920e7d2-3418-454b-9269-8f13a0c96d2d-service-ca-bundle\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.420684 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.420652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7920e7d2-3418-454b-9269-8f13a0c96d2d-snapshots\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.420684 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.420678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7920e7d2-3418-454b-9269-8f13a0c96d2d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.420874 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.420723 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7920e7d2-3418-454b-9269-8f13a0c96d2d-serving-cert\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.420874 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.420755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgh67\" (UniqueName: \"kubernetes.io/projected/7920e7d2-3418-454b-9269-8f13a0c96d2d-kube-api-access-fgh67\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.420979 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.420875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7920e7d2-3418-454b-9269-8f13a0c96d2d-tmp\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk"
Apr 18 02:48:13.421238 ip-10-0-128-79 kubenswrapper[2577]: I0418
02:48:13.421211 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7920e7d2-3418-454b-9269-8f13a0c96d2d-tmp\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk" Apr 18 02:48:13.421318 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.421279 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7920e7d2-3418-454b-9269-8f13a0c96d2d-snapshots\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk" Apr 18 02:48:13.421423 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.421406 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7920e7d2-3418-454b-9269-8f13a0c96d2d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk" Apr 18 02:48:13.421908 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.421888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7920e7d2-3418-454b-9269-8f13a0c96d2d-service-ca-bundle\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk" Apr 18 02:48:13.422880 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.422851 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7920e7d2-3418-454b-9269-8f13a0c96d2d-serving-cert\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " 
pod="openshift-insights/insights-operator-585dfdc468-25mgk" Apr 18 02:48:13.427990 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.427970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgh67\" (UniqueName: \"kubernetes.io/projected/7920e7d2-3418-454b-9269-8f13a0c96d2d-kube-api-access-fgh67\") pod \"insights-operator-585dfdc468-25mgk\" (UID: \"7920e7d2-3418-454b-9269-8f13a0c96d2d\") " pod="openshift-insights/insights-operator-585dfdc468-25mgk" Apr 18 02:48:13.493853 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.493836 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-25mgk" Apr 18 02:48:13.604020 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.603995 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-25mgk"] Apr 18 02:48:13.607074 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:48:13.607049 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7920e7d2_3418_454b_9269_8f13a0c96d2d.slice/crio-97907cc7a989883bbb338e7403f2a7e08ef6dbaf4cce8210f6504984d3479e26 WatchSource:0}: Error finding container 97907cc7a989883bbb338e7403f2a7e08ef6dbaf4cce8210f6504984d3479e26: Status 404 returned error can't find the container with id 97907cc7a989883bbb338e7403f2a7e08ef6dbaf4cce8210f6504984d3479e26 Apr 18 02:48:13.714536 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.714481 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-25mgk" event={"ID":"7920e7d2-3418-454b-9269-8f13a0c96d2d","Type":"ContainerStarted","Data":"97907cc7a989883bbb338e7403f2a7e08ef6dbaf4cce8210f6504984d3479e26"} Apr 18 02:48:13.823711 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.823689 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:13.823785 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.823718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9" Apr 18 02:48:13.823785 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:13.823743 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:13.823860 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:13.823835 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 18 02:48:13.823860 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:13.823838 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 18 02:48:13.823860 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:13.823855 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle podName:804c2955-592f-4663-b301-f7f6e6d14909 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:14.823834756 +0000 UTC m=+137.963686084 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle") pod "router-default-87df6b698-c7764" (UID: "804c2955-592f-4663-b301-f7f6e6d14909") : configmap references non-existent config key: service-ca.crt Apr 18 02:48:13.823959 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:13.823876 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs podName:804c2955-592f-4663-b301-f7f6e6d14909 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:14.823866845 +0000 UTC m=+137.963718161 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs") pod "router-default-87df6b698-c7764" (UID: "804c2955-592f-4663-b301-f7f6e6d14909") : secret "router-metrics-certs-default" not found Apr 18 02:48:13.823959 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:13.823890 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls podName:d50fae41-d0e5-4ab2-8306-73f4036be5d1 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:14.823882336 +0000 UTC m=+137.963733651 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-98wv9" (UID: "d50fae41-d0e5-4ab2-8306-73f4036be5d1") : secret "cluster-monitoring-operator-tls" not found Apr 18 02:48:14.830090 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:14.830059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:14.830090 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:14.830098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9" Apr 18 02:48:14.830539 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:14.830136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:14.830539 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:14.830219 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 18 02:48:14.830539 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:14.830229 2577 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle podName:804c2955-592f-4663-b301-f7f6e6d14909 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:16.830208422 +0000 UTC m=+139.970059736 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle") pod "router-default-87df6b698-c7764" (UID: "804c2955-592f-4663-b301-f7f6e6d14909") : configmap references non-existent config key: service-ca.crt Apr 18 02:48:14.830539 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:14.830259 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 18 02:48:14.830539 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:14.830277 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls podName:d50fae41-d0e5-4ab2-8306-73f4036be5d1 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:16.83026358 +0000 UTC m=+139.970114898 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-98wv9" (UID: "d50fae41-d0e5-4ab2-8306-73f4036be5d1") : secret "cluster-monitoring-operator-tls" not found Apr 18 02:48:14.830539 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:14.830309 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs podName:804c2955-592f-4663-b301-f7f6e6d14909 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:16.830292933 +0000 UTC m=+139.970144246 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs") pod "router-default-87df6b698-c7764" (UID: "804c2955-592f-4663-b301-f7f6e6d14909") : secret "router-metrics-certs-default" not found Apr 18 02:48:15.718922 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:15.718886 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-25mgk" event={"ID":"7920e7d2-3418-454b-9269-8f13a0c96d2d","Type":"ContainerStarted","Data":"d39d21178798cafc7631f51f291fb3c6db044c4c5ddfcffb68ef4d82682d654e"} Apr 18 02:48:15.733792 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:15.733740 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-25mgk" podStartSLOduration=0.830178866 podStartE2EDuration="2.73372795s" podCreationTimestamp="2026-04-18 02:48:13 +0000 UTC" firstStartedPulling="2026-04-18 02:48:13.608659593 +0000 UTC m=+136.748510905" lastFinishedPulling="2026-04-18 02:48:15.512208663 +0000 UTC m=+138.652059989" observedRunningTime="2026-04-18 02:48:15.732639745 +0000 UTC m=+138.872491080" watchObservedRunningTime="2026-04-18 02:48:15.73372795 +0000 UTC m=+138.873579284" Apr 18 02:48:16.847684 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:16.847651 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:16.848044 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:16.847690 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9" Apr 18 02:48:16.848044 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:16.847723 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:16.848044 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:16.847802 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle podName:804c2955-592f-4663-b301-f7f6e6d14909 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:20.847783389 +0000 UTC m=+143.987634716 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle") pod "router-default-87df6b698-c7764" (UID: "804c2955-592f-4663-b301-f7f6e6d14909") : configmap references non-existent config key: service-ca.crt Apr 18 02:48:16.848044 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:16.847803 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 18 02:48:16.848044 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:16.847841 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 18 02:48:16.848044 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:16.847880 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls podName:d50fae41-d0e5-4ab2-8306-73f4036be5d1 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:20.847863031 +0000 UTC m=+143.987714349 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-98wv9" (UID: "d50fae41-d0e5-4ab2-8306-73f4036be5d1") : secret "cluster-monitoring-operator-tls" not found Apr 18 02:48:16.848044 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:16.847898 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs podName:804c2955-592f-4663-b301-f7f6e6d14909 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:20.847888633 +0000 UTC m=+143.987739951 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs") pod "router-default-87df6b698-c7764" (UID: "804c2955-592f-4663-b301-f7f6e6d14909") : secret "router-metrics-certs-default" not found Apr 18 02:48:19.180239 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:19.180211 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qvkw2_525e6e89-8bf5-472a-bde7-bfb1254515af/dns-node-resolver/0.log" Apr 18 02:48:19.980236 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:19.980207 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cn84g_d48f7502-3de3-4ca9-92d5-5eaf5e999c97/node-ca/0.log" Apr 18 02:48:20.873567 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:20.873529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:20.873964 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:20.873580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9" Apr 18 02:48:20.873964 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:20.873662 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 18 02:48:20.873964 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:20.873682 2577 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle podName:804c2955-592f-4663-b301-f7f6e6d14909 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:28.873665118 +0000 UTC m=+152.013516431 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle") pod "router-default-87df6b698-c7764" (UID: "804c2955-592f-4663-b301-f7f6e6d14909") : configmap references non-existent config key: service-ca.crt Apr 18 02:48:20.873964 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:20.873710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:20.873964 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:20.873757 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls podName:d50fae41-d0e5-4ab2-8306-73f4036be5d1 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:28.873749722 +0000 UTC m=+152.013601034 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-98wv9" (UID: "d50fae41-d0e5-4ab2-8306-73f4036be5d1") : secret "cluster-monitoring-operator-tls" not found Apr 18 02:48:20.873964 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:20.873832 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 18 02:48:20.873964 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:20.873888 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs podName:804c2955-592f-4663-b301-f7f6e6d14909 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:28.873872876 +0000 UTC m=+152.013724193 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs") pod "router-default-87df6b698-c7764" (UID: "804c2955-592f-4663-b301-f7f6e6d14909") : secret "router-metrics-certs-default" not found Apr 18 02:48:21.126626 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.126541 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nprbc"] Apr 18 02:48:21.133991 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.133964 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nprbc" Apr 18 02:48:21.135747 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.135724 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nprbc"] Apr 18 02:48:21.136312 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.136293 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 18 02:48:21.136581 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.136544 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 18 02:48:21.136686 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.136596 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-q5q5w\"" Apr 18 02:48:21.176072 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.176050 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x7xt\" (UniqueName: \"kubernetes.io/projected/581ba2b4-298f-4936-b05d-4ef4efbe33ad-kube-api-access-9x7xt\") pod \"volume-data-source-validator-7c6cbb6c87-nprbc\" (UID: \"581ba2b4-298f-4936-b05d-4ef4efbe33ad\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nprbc" Apr 18 02:48:21.230875 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.230851 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q"] Apr 18 02:48:21.234900 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.234887 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:21.237396 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.237372 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 18 02:48:21.237396 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.237393 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 18 02:48:21.237508 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.237455 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 18 02:48:21.237579 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.237519 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-4h275\"" Apr 18 02:48:21.243077 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.243059 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q"] Apr 18 02:48:21.277110 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.277089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x7xt\" (UniqueName: \"kubernetes.io/projected/581ba2b4-298f-4936-b05d-4ef4efbe33ad-kube-api-access-9x7xt\") pod \"volume-data-source-validator-7c6cbb6c87-nprbc\" (UID: \"581ba2b4-298f-4936-b05d-4ef4efbe33ad\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nprbc" Apr 18 02:48:21.277208 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.277134 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6csvs\" (UniqueName: 
\"kubernetes.io/projected/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-kube-api-access-6csvs\") pod \"cluster-samples-operator-6dc5bdb6b4-c4r9q\" (UID: \"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:21.277208 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.277193 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c4r9q\" (UID: \"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:21.284900 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.284880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x7xt\" (UniqueName: \"kubernetes.io/projected/581ba2b4-298f-4936-b05d-4ef4efbe33ad-kube-api-access-9x7xt\") pod \"volume-data-source-validator-7c6cbb6c87-nprbc\" (UID: \"581ba2b4-298f-4936-b05d-4ef4efbe33ad\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nprbc" Apr 18 02:48:21.377734 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.377680 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6csvs\" (UniqueName: \"kubernetes.io/projected/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-kube-api-access-6csvs\") pod \"cluster-samples-operator-6dc5bdb6b4-c4r9q\" (UID: \"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:21.377819 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.377742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-c4r9q\" (UID: \"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:21.377860 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:21.377829 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 18 02:48:21.377893 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:21.377885 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls podName:bb7af832-90b6-4f2c-8b7b-84d9559e4ea8 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:21.877867324 +0000 UTC m=+145.017718638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-c4r9q" (UID: "bb7af832-90b6-4f2c-8b7b-84d9559e4ea8") : secret "samples-operator-tls" not found Apr 18 02:48:21.385250 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.385226 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6csvs\" (UniqueName: \"kubernetes.io/projected/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-kube-api-access-6csvs\") pod \"cluster-samples-operator-6dc5bdb6b4-c4r9q\" (UID: \"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:21.444330 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.444310 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nprbc" Apr 18 02:48:21.553684 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.553654 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nprbc"] Apr 18 02:48:21.731668 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.731638 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nprbc" event={"ID":"581ba2b4-298f-4936-b05d-4ef4efbe33ad","Type":"ContainerStarted","Data":"6f5b354ad5c50735c5bb81a0897c7d5188071a103cb7b1f9ddf9a9ab64a090eb"} Apr 18 02:48:21.882323 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:21.882290 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c4r9q\" (UID: \"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:21.882730 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:21.882438 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 18 02:48:21.882730 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:21.882508 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls podName:bb7af832-90b6-4f2c-8b7b-84d9559e4ea8 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:22.882491558 +0000 UTC m=+146.022342872 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-c4r9q" (UID: "bb7af832-90b6-4f2c-8b7b-84d9559e4ea8") : secret "samples-operator-tls" not found Apr 18 02:48:22.130479 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.130411 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-sq72w"] Apr 18 02:48:22.134015 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.133992 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.136467 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.136443 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 18 02:48:22.136588 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.136448 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 18 02:48:22.136800 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.136784 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 18 02:48:22.136894 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.136819 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 18 02:48:22.137658 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.137642 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-hg66t\"" Apr 18 02:48:22.142717 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.142634 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 18 02:48:22.143777 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.143734 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-sq72w"] Apr 18 02:48:22.185162 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.185132 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34bc19f4-315e-4021-baf3-35c6d4a2d5d8-config\") pod \"console-operator-9d4b6777b-sq72w\" (UID: \"34bc19f4-315e-4021-baf3-35c6d4a2d5d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.185287 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.185264 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7ds7\" (UniqueName: \"kubernetes.io/projected/34bc19f4-315e-4021-baf3-35c6d4a2d5d8-kube-api-access-p7ds7\") pod \"console-operator-9d4b6777b-sq72w\" (UID: \"34bc19f4-315e-4021-baf3-35c6d4a2d5d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.185359 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.185342 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34bc19f4-315e-4021-baf3-35c6d4a2d5d8-serving-cert\") pod \"console-operator-9d4b6777b-sq72w\" (UID: \"34bc19f4-315e-4021-baf3-35c6d4a2d5d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.185408 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.185398 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34bc19f4-315e-4021-baf3-35c6d4a2d5d8-trusted-ca\") pod \"console-operator-9d4b6777b-sq72w\" (UID: \"34bc19f4-315e-4021-baf3-35c6d4a2d5d8\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.286711 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.286681 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34bc19f4-315e-4021-baf3-35c6d4a2d5d8-serving-cert\") pod \"console-operator-9d4b6777b-sq72w\" (UID: \"34bc19f4-315e-4021-baf3-35c6d4a2d5d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.286862 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.286740 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34bc19f4-315e-4021-baf3-35c6d4a2d5d8-trusted-ca\") pod \"console-operator-9d4b6777b-sq72w\" (UID: \"34bc19f4-315e-4021-baf3-35c6d4a2d5d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.287179 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.286983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34bc19f4-315e-4021-baf3-35c6d4a2d5d8-config\") pod \"console-operator-9d4b6777b-sq72w\" (UID: \"34bc19f4-315e-4021-baf3-35c6d4a2d5d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.287179 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.287091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7ds7\" (UniqueName: \"kubernetes.io/projected/34bc19f4-315e-4021-baf3-35c6d4a2d5d8-kube-api-access-p7ds7\") pod \"console-operator-9d4b6777b-sq72w\" (UID: \"34bc19f4-315e-4021-baf3-35c6d4a2d5d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.287628 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.287609 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/34bc19f4-315e-4021-baf3-35c6d4a2d5d8-trusted-ca\") pod \"console-operator-9d4b6777b-sq72w\" (UID: \"34bc19f4-315e-4021-baf3-35c6d4a2d5d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.288163 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.288144 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34bc19f4-315e-4021-baf3-35c6d4a2d5d8-config\") pod \"console-operator-9d4b6777b-sq72w\" (UID: \"34bc19f4-315e-4021-baf3-35c6d4a2d5d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.289481 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.289462 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34bc19f4-315e-4021-baf3-35c6d4a2d5d8-serving-cert\") pod \"console-operator-9d4b6777b-sq72w\" (UID: \"34bc19f4-315e-4021-baf3-35c6d4a2d5d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.294970 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.294929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7ds7\" (UniqueName: \"kubernetes.io/projected/34bc19f4-315e-4021-baf3-35c6d4a2d5d8-kube-api-access-p7ds7\") pod \"console-operator-9d4b6777b-sq72w\" (UID: \"34bc19f4-315e-4021-baf3-35c6d4a2d5d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.445961 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.445888 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:22.832012 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.831989 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-sq72w"] Apr 18 02:48:22.835142 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:48:22.835116 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34bc19f4_315e_4021_baf3_35c6d4a2d5d8.slice/crio-f35bc45d9b31d8913f2e090e86f325fca03ee501c620eff5e8fba338d42a7b8a WatchSource:0}: Error finding container f35bc45d9b31d8913f2e090e86f325fca03ee501c620eff5e8fba338d42a7b8a: Status 404 returned error can't find the container with id f35bc45d9b31d8913f2e090e86f325fca03ee501c620eff5e8fba338d42a7b8a Apr 18 02:48:22.894439 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:22.894415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c4r9q\" (UID: \"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:22.894750 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:22.894566 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 18 02:48:22.894750 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:22.894628 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls podName:bb7af832-90b6-4f2c-8b7b-84d9559e4ea8 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:24.894613875 +0000 UTC m=+148.034465187 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-c4r9q" (UID: "bb7af832-90b6-4f2c-8b7b-84d9559e4ea8") : secret "samples-operator-tls" not found Apr 18 02:48:23.736578 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:23.736524 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nprbc" event={"ID":"581ba2b4-298f-4936-b05d-4ef4efbe33ad","Type":"ContainerStarted","Data":"d843192c1d170d3de36233d506afefa3342a4ca03e0cad8fee7e12c5ccb6a647"} Apr 18 02:48:23.737503 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:23.737476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" event={"ID":"34bc19f4-315e-4021-baf3-35c6d4a2d5d8","Type":"ContainerStarted","Data":"f35bc45d9b31d8913f2e090e86f325fca03ee501c620eff5e8fba338d42a7b8a"} Apr 18 02:48:23.750480 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:23.750442 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nprbc" podStartSLOduration=1.531044386 podStartE2EDuration="2.750430477s" podCreationTimestamp="2026-04-18 02:48:21 +0000 UTC" firstStartedPulling="2026-04-18 02:48:21.559925262 +0000 UTC m=+144.699776577" lastFinishedPulling="2026-04-18 02:48:22.779311333 +0000 UTC m=+145.919162668" observedRunningTime="2026-04-18 02:48:23.749593667 +0000 UTC m=+146.889445004" watchObservedRunningTime="2026-04-18 02:48:23.750430477 +0000 UTC m=+146.890281811" Apr 18 02:48:24.740510 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:24.740489 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/0.log" Apr 18 02:48:24.740820 
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:24.740526 2577 generic.go:358] "Generic (PLEG): container finished" podID="34bc19f4-315e-4021-baf3-35c6d4a2d5d8" containerID="d59cdc0a370b6201337a547886c2dbd82d7486ccfb963a9c1b3c9694553fa579" exitCode=255 Apr 18 02:48:24.740820 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:24.740571 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" event={"ID":"34bc19f4-315e-4021-baf3-35c6d4a2d5d8","Type":"ContainerDied","Data":"d59cdc0a370b6201337a547886c2dbd82d7486ccfb963a9c1b3c9694553fa579"} Apr 18 02:48:24.740820 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:24.740784 2577 scope.go:117] "RemoveContainer" containerID="d59cdc0a370b6201337a547886c2dbd82d7486ccfb963a9c1b3c9694553fa579" Apr 18 02:48:24.913907 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:24.913883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c4r9q\" (UID: \"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:24.914023 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:24.914005 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 18 02:48:24.914073 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:24.914064 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls podName:bb7af832-90b6-4f2c-8b7b-84d9559e4ea8 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:28.91404954 +0000 UTC m=+152.053900853 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-c4r9q" (UID: "bb7af832-90b6-4f2c-8b7b-84d9559e4ea8") : secret "samples-operator-tls" not found Apr 18 02:48:25.744602 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:25.744575 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/1.log" Apr 18 02:48:25.744954 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:25.744892 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/0.log" Apr 18 02:48:25.744954 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:25.744926 2577 generic.go:358] "Generic (PLEG): container finished" podID="34bc19f4-315e-4021-baf3-35c6d4a2d5d8" containerID="4685ae481a5a7d003f1dd8e608f9f39da31bb4ef153ddf7aed4f56e4d6ce1eca" exitCode=255 Apr 18 02:48:25.745037 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:25.744956 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" event={"ID":"34bc19f4-315e-4021-baf3-35c6d4a2d5d8","Type":"ContainerDied","Data":"4685ae481a5a7d003f1dd8e608f9f39da31bb4ef153ddf7aed4f56e4d6ce1eca"} Apr 18 02:48:25.745037 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:25.744987 2577 scope.go:117] "RemoveContainer" containerID="d59cdc0a370b6201337a547886c2dbd82d7486ccfb963a9c1b3c9694553fa579" Apr 18 02:48:25.745229 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:25.745205 2577 scope.go:117] "RemoveContainer" containerID="4685ae481a5a7d003f1dd8e608f9f39da31bb4ef153ddf7aed4f56e4d6ce1eca" Apr 18 02:48:25.745393 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:25.745369 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-sq72w_openshift-console-operator(34bc19f4-315e-4021-baf3-35c6d4a2d5d8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" podUID="34bc19f4-315e-4021-baf3-35c6d4a2d5d8" Apr 18 02:48:26.748379 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:26.748353 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/1.log" Apr 18 02:48:26.748735 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:26.748688 2577 scope.go:117] "RemoveContainer" containerID="4685ae481a5a7d003f1dd8e608f9f39da31bb4ef153ddf7aed4f56e4d6ce1eca" Apr 18 02:48:26.748864 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:26.748847 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-sq72w_openshift-console-operator(34bc19f4-315e-4021-baf3-35c6d4a2d5d8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" podUID="34bc19f4-315e-4021-baf3-35c6d4a2d5d8" Apr 18 02:48:28.943476 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:28.943445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:28.943931 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:28.943592 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c4r9q\" (UID: \"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:28.943931 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:28.943625 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:28.943931 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:28.943651 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9" Apr 18 02:48:28.943931 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:28.943621 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 18 02:48:28.943931 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:28.943683 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 18 02:48:28.943931 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:28.943743 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs podName:804c2955-592f-4663-b301-f7f6e6d14909 nodeName:}" failed. 
No retries permitted until 2026-04-18 02:48:44.943722305 +0000 UTC m=+168.083573617 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs") pod "router-default-87df6b698-c7764" (UID: "804c2955-592f-4663-b301-f7f6e6d14909") : secret "router-metrics-certs-default" not found Apr 18 02:48:28.943931 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:28.943748 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 18 02:48:28.943931 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:28.943766 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle podName:804c2955-592f-4663-b301-f7f6e6d14909 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:44.943754561 +0000 UTC m=+168.083605877 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle") pod "router-default-87df6b698-c7764" (UID: "804c2955-592f-4663-b301-f7f6e6d14909") : configmap references non-existent config key: service-ca.crt Apr 18 02:48:28.943931 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:28.943779 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls podName:bb7af832-90b6-4f2c-8b7b-84d9559e4ea8 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:36.943773853 +0000 UTC m=+160.083625165 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-c4r9q" (UID: "bb7af832-90b6-4f2c-8b7b-84d9559e4ea8") : secret "samples-operator-tls" not found Apr 18 02:48:28.943931 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:28.943789 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls podName:d50fae41-d0e5-4ab2-8306-73f4036be5d1 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:44.943784133 +0000 UTC m=+168.083635446 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-98wv9" (UID: "d50fae41-d0e5-4ab2-8306-73f4036be5d1") : secret "cluster-monitoring-operator-tls" not found Apr 18 02:48:29.151616 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.151593 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-65dxd"] Apr 18 02:48:29.155664 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.155649 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-65dxd" Apr 18 02:48:29.158291 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.158274 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 18 02:48:29.158497 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.158483 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 18 02:48:29.158568 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.158484 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 18 02:48:29.158613 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.158579 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 18 02:48:29.159600 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.159585 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-vbrzg\"" Apr 18 02:48:29.163913 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.163893 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-65dxd"] Apr 18 02:48:29.246363 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.246343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ec9684d6-e362-40d5-8fdd-149ef206b77f-signing-key\") pod \"service-ca-865cb79987-65dxd\" (UID: \"ec9684d6-e362-40d5-8fdd-149ef206b77f\") " pod="openshift-service-ca/service-ca-865cb79987-65dxd" Apr 18 02:48:29.246445 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.246373 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pspml\" (UniqueName: 
\"kubernetes.io/projected/ec9684d6-e362-40d5-8fdd-149ef206b77f-kube-api-access-pspml\") pod \"service-ca-865cb79987-65dxd\" (UID: \"ec9684d6-e362-40d5-8fdd-149ef206b77f\") " pod="openshift-service-ca/service-ca-865cb79987-65dxd" Apr 18 02:48:29.246445 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.246404 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ec9684d6-e362-40d5-8fdd-149ef206b77f-signing-cabundle\") pod \"service-ca-865cb79987-65dxd\" (UID: \"ec9684d6-e362-40d5-8fdd-149ef206b77f\") " pod="openshift-service-ca/service-ca-865cb79987-65dxd" Apr 18 02:48:29.347542 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.347509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ec9684d6-e362-40d5-8fdd-149ef206b77f-signing-key\") pod \"service-ca-865cb79987-65dxd\" (UID: \"ec9684d6-e362-40d5-8fdd-149ef206b77f\") " pod="openshift-service-ca/service-ca-865cb79987-65dxd" Apr 18 02:48:29.347662 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.347575 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pspml\" (UniqueName: \"kubernetes.io/projected/ec9684d6-e362-40d5-8fdd-149ef206b77f-kube-api-access-pspml\") pod \"service-ca-865cb79987-65dxd\" (UID: \"ec9684d6-e362-40d5-8fdd-149ef206b77f\") " pod="openshift-service-ca/service-ca-865cb79987-65dxd" Apr 18 02:48:29.347662 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.347624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ec9684d6-e362-40d5-8fdd-149ef206b77f-signing-cabundle\") pod \"service-ca-865cb79987-65dxd\" (UID: \"ec9684d6-e362-40d5-8fdd-149ef206b77f\") " pod="openshift-service-ca/service-ca-865cb79987-65dxd" Apr 18 02:48:29.348193 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:48:29.348176 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ec9684d6-e362-40d5-8fdd-149ef206b77f-signing-cabundle\") pod \"service-ca-865cb79987-65dxd\" (UID: \"ec9684d6-e362-40d5-8fdd-149ef206b77f\") " pod="openshift-service-ca/service-ca-865cb79987-65dxd" Apr 18 02:48:29.349737 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.349712 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ec9684d6-e362-40d5-8fdd-149ef206b77f-signing-key\") pod \"service-ca-865cb79987-65dxd\" (UID: \"ec9684d6-e362-40d5-8fdd-149ef206b77f\") " pod="openshift-service-ca/service-ca-865cb79987-65dxd" Apr 18 02:48:29.355564 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.355529 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pspml\" (UniqueName: \"kubernetes.io/projected/ec9684d6-e362-40d5-8fdd-149ef206b77f-kube-api-access-pspml\") pod \"service-ca-865cb79987-65dxd\" (UID: \"ec9684d6-e362-40d5-8fdd-149ef206b77f\") " pod="openshift-service-ca/service-ca-865cb79987-65dxd" Apr 18 02:48:29.465015 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.464993 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-65dxd" Apr 18 02:48:29.574158 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.574132 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-65dxd"] Apr 18 02:48:29.576722 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:48:29.576689 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec9684d6_e362_40d5_8fdd_149ef206b77f.slice/crio-05c0d29f6f1c97a57ba83a2788dc472d5f0bce5088ac63db70fced4751dcd3f8 WatchSource:0}: Error finding container 05c0d29f6f1c97a57ba83a2788dc472d5f0bce5088ac63db70fced4751dcd3f8: Status 404 returned error can't find the container with id 05c0d29f6f1c97a57ba83a2788dc472d5f0bce5088ac63db70fced4751dcd3f8 Apr 18 02:48:29.756032 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:29.756005 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-65dxd" event={"ID":"ec9684d6-e362-40d5-8fdd-149ef206b77f","Type":"ContainerStarted","Data":"05c0d29f6f1c97a57ba83a2788dc472d5f0bce5088ac63db70fced4751dcd3f8"} Apr 18 02:48:31.768343 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:31.768303 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-65dxd" event={"ID":"ec9684d6-e362-40d5-8fdd-149ef206b77f","Type":"ContainerStarted","Data":"384daf2ab38a849cdcc181636d52612be3f6445d6413b07989615a5f63b83742"} Apr 18 02:48:31.784996 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:31.784956 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-65dxd" podStartSLOduration=1.22843871 podStartE2EDuration="2.784943437s" podCreationTimestamp="2026-04-18 02:48:29 +0000 UTC" firstStartedPulling="2026-04-18 02:48:29.578514012 +0000 UTC m=+152.718365325" lastFinishedPulling="2026-04-18 02:48:31.135018735 +0000 UTC 
m=+154.274870052" observedRunningTime="2026-04-18 02:48:31.784351813 +0000 UTC m=+154.924203149" watchObservedRunningTime="2026-04-18 02:48:31.784943437 +0000 UTC m=+154.924794771" Apr 18 02:48:32.446405 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:32.446369 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:32.446405 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:32.446403 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:32.446811 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:32.446794 2577 scope.go:117] "RemoveContainer" containerID="4685ae481a5a7d003f1dd8e608f9f39da31bb4ef153ddf7aed4f56e4d6ce1eca" Apr 18 02:48:32.446974 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:32.446958 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-sq72w_openshift-console-operator(34bc19f4-315e-4021-baf3-35c6d4a2d5d8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" podUID="34bc19f4-315e-4021-baf3-35c6d4a2d5d8" Apr 18 02:48:32.779605 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:32.779568 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" podUID="f5f88955-bfd0-4343-b6c6-2a18dd0d1149" Apr 18 02:48:32.780613 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:32.780594 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9l4pp" 
podUID="87aa1c86-8143-4fdc-b899-17184a387dcf" Apr 18 02:48:32.815743 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:32.815714 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-ms6tq" podUID="17f79784-a585-4ce4-ae11-e420d136c2d0" Apr 18 02:48:33.374568 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:33.374510 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-6xc88" podUID="64806518-b360-4104-92e5-8a3017ab382a" Apr 18 02:48:33.773705 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:33.773679 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:48:33.773895 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:33.773684 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9l4pp" Apr 18 02:48:37.018247 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.018211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c4r9q\" (UID: \"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:37.020671 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.020648 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb7af832-90b6-4f2c-8b7b-84d9559e4ea8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-c4r9q\" (UID: \"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:37.143823 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.143797 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" Apr 18 02:48:37.260510 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.260473 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q"] Apr 18 02:48:37.724489 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.724370 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq" Apr 18 02:48:37.724489 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.724470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:48:37.724744 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.724608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp" Apr 18 02:48:37.726978 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.726947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") pod \"image-registry-7f9c87579f-2s7d5\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:48:37.727289 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:48:37.727272 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87aa1c86-8143-4fdc-b899-17184a387dcf-metrics-tls\") pod \"dns-default-9l4pp\" (UID: \"87aa1c86-8143-4fdc-b899-17184a387dcf\") " pod="openshift-dns/dns-default-9l4pp" Apr 18 02:48:37.727331 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.727298 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f79784-a585-4ce4-ae11-e420d136c2d0-cert\") pod \"ingress-canary-ms6tq\" (UID: \"17f79784-a585-4ce4-ae11-e420d136c2d0\") " pod="openshift-ingress-canary/ingress-canary-ms6tq" Apr 18 02:48:37.784053 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.784016 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" event={"ID":"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8","Type":"ContainerStarted","Data":"8011189a64f3cc694d21025df582953b49e7d37b61749d209b74cf9e1fa05957"} Apr 18 02:48:37.976976 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.976883 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-djfrv\"" Apr 18 02:48:37.976976 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.976933 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jxh9b\"" Apr 18 02:48:37.985182 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.985160 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9l4pp" Apr 18 02:48:37.985308 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:37.985275 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:48:38.125089 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:38.125065 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7f9c87579f-2s7d5"] Apr 18 02:48:38.127642 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:48:38.127613 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5f88955_bfd0_4343_b6c6_2a18dd0d1149.slice/crio-6b049e3894252bb2990c51d5021e309206af36f12c6bdc0a80b76af2558160ce WatchSource:0}: Error finding container 6b049e3894252bb2990c51d5021e309206af36f12c6bdc0a80b76af2558160ce: Status 404 returned error can't find the container with id 6b049e3894252bb2990c51d5021e309206af36f12c6bdc0a80b76af2558160ce Apr 18 02:48:38.141850 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:38.141826 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9l4pp"] Apr 18 02:48:38.144651 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:48:38.144614 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87aa1c86_8143_4fdc_b899_17184a387dcf.slice/crio-f891169f6e26c2ff3be39d7a351c9d12bc67c810d348efd344c4cf3841e5db34 WatchSource:0}: Error finding container f891169f6e26c2ff3be39d7a351c9d12bc67c810d348efd344c4cf3841e5db34: Status 404 returned error can't find the container with id f891169f6e26c2ff3be39d7a351c9d12bc67c810d348efd344c4cf3841e5db34 Apr 18 02:48:38.788101 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:38.788067 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9l4pp" event={"ID":"87aa1c86-8143-4fdc-b899-17184a387dcf","Type":"ContainerStarted","Data":"f891169f6e26c2ff3be39d7a351c9d12bc67c810d348efd344c4cf3841e5db34"} Apr 18 02:48:38.790178 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:38.790146 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" event={"ID":"f5f88955-bfd0-4343-b6c6-2a18dd0d1149","Type":"ContainerStarted","Data":"9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2"} Apr 18 02:48:38.790318 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:38.790182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" event={"ID":"f5f88955-bfd0-4343-b6c6-2a18dd0d1149","Type":"ContainerStarted","Data":"6b049e3894252bb2990c51d5021e309206af36f12c6bdc0a80b76af2558160ce"} Apr 18 02:48:38.790383 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:38.790347 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:48:38.809355 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:38.808921 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" podStartSLOduration=174.808903041 podStartE2EDuration="2m54.808903041s" podCreationTimestamp="2026-04-18 02:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:48:38.807572119 +0000 UTC m=+161.947423455" watchObservedRunningTime="2026-04-18 02:48:38.808903041 +0000 UTC m=+161.948754377" Apr 18 02:48:39.794942 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:39.794904 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" event={"ID":"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8","Type":"ContainerStarted","Data":"f7e207299e6f461e2a2ad2840815aad08caf4cef5fdf930c517e2b2f3c7b0867"} Apr 18 02:48:39.795407 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:39.794952 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" event={"ID":"bb7af832-90b6-4f2c-8b7b-84d9559e4ea8","Type":"ContainerStarted","Data":"30e6d563524ec6380cfdd29a0c1ef56355929b62fc4873023e2e3d3e15f6495c"} Apr 18 02:48:39.817602 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:39.817538 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-c4r9q" podStartSLOduration=17.05942604 podStartE2EDuration="18.817522884s" podCreationTimestamp="2026-04-18 02:48:21 +0000 UTC" firstStartedPulling="2026-04-18 02:48:37.299293668 +0000 UTC m=+160.439144982" lastFinishedPulling="2026-04-18 02:48:39.057390499 +0000 UTC m=+162.197241826" observedRunningTime="2026-04-18 02:48:39.816850823 +0000 UTC m=+162.956702160" watchObservedRunningTime="2026-04-18 02:48:39.817522884 +0000 UTC m=+162.957374219" Apr 18 02:48:40.798218 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:40.798181 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9l4pp" event={"ID":"87aa1c86-8143-4fdc-b899-17184a387dcf","Type":"ContainerStarted","Data":"38a92957bd1c277ebc344c8178fba1df56037a5ec5d76b13022b77df3a9015c1"} Apr 18 02:48:40.798218 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:40.798218 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9l4pp" event={"ID":"87aa1c86-8143-4fdc-b899-17184a387dcf","Type":"ContainerStarted","Data":"b0624730147d8095c45ec475f9ede9a09afa0eedeaf1c53a56b3c1677e45d030"} Apr 18 02:48:40.815224 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:40.815178 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9l4pp" podStartSLOduration=130.109346156 podStartE2EDuration="2m11.815165408s" podCreationTimestamp="2026-04-18 02:46:29 +0000 UTC" firstStartedPulling="2026-04-18 02:48:38.146909886 +0000 UTC m=+161.286761207" lastFinishedPulling="2026-04-18 
02:48:39.852729141 +0000 UTC m=+162.992580459" observedRunningTime="2026-04-18 02:48:40.814574234 +0000 UTC m=+163.954425565" watchObservedRunningTime="2026-04-18 02:48:40.815165408 +0000 UTC m=+163.955016737" Apr 18 02:48:41.801465 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:41.801435 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9l4pp" Apr 18 02:48:43.358598 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:43.358544 2577 scope.go:117] "RemoveContainer" containerID="4685ae481a5a7d003f1dd8e608f9f39da31bb4ef153ddf7aed4f56e4d6ce1eca" Apr 18 02:48:43.807964 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:43.807937 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 02:48:43.808327 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:43.808309 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/1.log" Apr 18 02:48:43.808382 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:43.808343 2577 generic.go:358] "Generic (PLEG): container finished" podID="34bc19f4-315e-4021-baf3-35c6d4a2d5d8" containerID="35d4da3345bbca79a9f28865d367a8a93166dbcf1dfd1aef53e81df9abb2a346" exitCode=255 Apr 18 02:48:43.808382 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:43.808375 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" event={"ID":"34bc19f4-315e-4021-baf3-35c6d4a2d5d8","Type":"ContainerDied","Data":"35d4da3345bbca79a9f28865d367a8a93166dbcf1dfd1aef53e81df9abb2a346"} Apr 18 02:48:43.808444 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:43.808401 2577 scope.go:117] "RemoveContainer" containerID="4685ae481a5a7d003f1dd8e608f9f39da31bb4ef153ddf7aed4f56e4d6ce1eca" Apr 18 02:48:43.808826 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:48:43.808800 2577 scope.go:117] "RemoveContainer" containerID="35d4da3345bbca79a9f28865d367a8a93166dbcf1dfd1aef53e81df9abb2a346" Apr 18 02:48:43.809050 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:43.809028 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-sq72w_openshift-console-operator(34bc19f4-315e-4021-baf3-35c6d4a2d5d8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" podUID="34bc19f4-315e-4021-baf3-35c6d4a2d5d8" Apr 18 02:48:44.358370 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:44.358285 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:48:44.812478 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:44.812451 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 02:48:44.991001 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:44.990964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:44.991183 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:44.991010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9" Apr 18 02:48:44.991183 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:44.991048 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:44.991588 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:44.991566 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804c2955-592f-4663-b301-f7f6e6d14909-service-ca-bundle\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:44.993429 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:44.993410 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d50fae41-d0e5-4ab2-8306-73f4036be5d1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-98wv9\" (UID: \"d50fae41-d0e5-4ab2-8306-73f4036be5d1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9" Apr 18 02:48:44.993515 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:44.993456 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/804c2955-592f-4663-b301-f7f6e6d14909-metrics-certs\") pod \"router-default-87df6b698-c7764\" (UID: \"804c2955-592f-4663-b301-f7f6e6d14909\") " pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:45.193718 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:45.193636 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9" Apr 18 02:48:45.198413 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:45.198392 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:45.314641 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:45.314559 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9"] Apr 18 02:48:45.317529 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:48:45.317503 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd50fae41_d0e5_4ab2_8306_73f4036be5d1.slice/crio-2e391673a6c910ebe5a5682238f81be8f4158af9a44ce7dcb470bee7e1e4e5b9 WatchSource:0}: Error finding container 2e391673a6c910ebe5a5682238f81be8f4158af9a44ce7dcb470bee7e1e4e5b9: Status 404 returned error can't find the container with id 2e391673a6c910ebe5a5682238f81be8f4158af9a44ce7dcb470bee7e1e4e5b9 Apr 18 02:48:45.332043 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:45.332020 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-87df6b698-c7764"] Apr 18 02:48:45.334793 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:48:45.334767 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804c2955_592f_4663_b301_f7f6e6d14909.slice/crio-dbe2e896722ae882fc5e8867c2a7aa4b82d1b8a039d71cbbf96d81273d0cc1a9 WatchSource:0}: Error finding container dbe2e896722ae882fc5e8867c2a7aa4b82d1b8a039d71cbbf96d81273d0cc1a9: Status 404 returned error can't find the container with id dbe2e896722ae882fc5e8867c2a7aa4b82d1b8a039d71cbbf96d81273d0cc1a9 Apr 18 02:48:45.358792 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:45.358770 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ms6tq" Apr 18 02:48:45.361345 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:45.361327 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dwvkt\"" Apr 18 02:48:45.369206 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:45.369187 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ms6tq" Apr 18 02:48:45.485658 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:45.485629 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ms6tq"] Apr 18 02:48:45.488985 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:48:45.488958 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f79784_a585_4ce4_ae11_e420d136c2d0.slice/crio-143fc5e1aa52a08daf9bab36577f14e729a409015e24d8998110912bd329a906 WatchSource:0}: Error finding container 143fc5e1aa52a08daf9bab36577f14e729a409015e24d8998110912bd329a906: Status 404 returned error can't find the container with id 143fc5e1aa52a08daf9bab36577f14e729a409015e24d8998110912bd329a906 Apr 18 02:48:45.816968 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:45.816930 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-87df6b698-c7764" event={"ID":"804c2955-592f-4663-b301-f7f6e6d14909","Type":"ContainerStarted","Data":"c629eb743a9232171d009be658f9a165f1b8d58e1f45bc7fbbd31cab1f731257"} Apr 18 02:48:45.816968 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:45.816974 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-87df6b698-c7764" event={"ID":"804c2955-592f-4663-b301-f7f6e6d14909","Type":"ContainerStarted","Data":"dbe2e896722ae882fc5e8867c2a7aa4b82d1b8a039d71cbbf96d81273d0cc1a9"} Apr 18 02:48:45.818022 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:48:45.818000 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ms6tq" event={"ID":"17f79784-a585-4ce4-ae11-e420d136c2d0","Type":"ContainerStarted","Data":"143fc5e1aa52a08daf9bab36577f14e729a409015e24d8998110912bd329a906"} Apr 18 02:48:45.818886 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:45.818868 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9" event={"ID":"d50fae41-d0e5-4ab2-8306-73f4036be5d1","Type":"ContainerStarted","Data":"2e391673a6c910ebe5a5682238f81be8f4158af9a44ce7dcb470bee7e1e4e5b9"} Apr 18 02:48:45.833654 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:45.833612 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-87df6b698-c7764" podStartSLOduration=32.833598606 podStartE2EDuration="32.833598606s" podCreationTimestamp="2026-04-18 02:48:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:48:45.833006588 +0000 UTC m=+168.972857917" watchObservedRunningTime="2026-04-18 02:48:45.833598606 +0000 UTC m=+168.973449941" Apr 18 02:48:46.198675 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:46.198593 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:46.201672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:46.201648 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:46.821632 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:46.821594 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:46.822924 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:46.822900 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ingress/router-default-87df6b698-c7764" Apr 18 02:48:47.825380 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:47.825296 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ms6tq" event={"ID":"17f79784-a585-4ce4-ae11-e420d136c2d0","Type":"ContainerStarted","Data":"1520dcd72e883aab633897d9120638af871f4411ba118a52e971d9ca99320a95"} Apr 18 02:48:47.826750 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:47.826724 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9" event={"ID":"d50fae41-d0e5-4ab2-8306-73f4036be5d1","Type":"ContainerStarted","Data":"4962c0e7c199c57f3055aba62becb6d58ad103a05df4dd8d545b5aff4a9eb254"} Apr 18 02:48:47.844456 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:47.844405 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ms6tq" podStartSLOduration=136.767295522 podStartE2EDuration="2m18.844390486s" podCreationTimestamp="2026-04-18 02:46:29 +0000 UTC" firstStartedPulling="2026-04-18 02:48:45.490779693 +0000 UTC m=+168.630631006" lastFinishedPulling="2026-04-18 02:48:47.567874642 +0000 UTC m=+170.707725970" observedRunningTime="2026-04-18 02:48:47.843840776 +0000 UTC m=+170.983692112" watchObservedRunningTime="2026-04-18 02:48:47.844390486 +0000 UTC m=+170.984241821" Apr 18 02:48:47.861006 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:47.860961 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-98wv9" podStartSLOduration=32.616459275 podStartE2EDuration="34.860935938s" podCreationTimestamp="2026-04-18 02:48:13 +0000 UTC" firstStartedPulling="2026-04-18 02:48:45.319277798 +0000 UTC m=+168.459129112" lastFinishedPulling="2026-04-18 02:48:47.563754461 +0000 UTC m=+170.703605775" observedRunningTime="2026-04-18 02:48:47.860418515 
+0000 UTC m=+171.000269862" watchObservedRunningTime="2026-04-18 02:48:47.860935938 +0000 UTC m=+171.000787273" Apr 18 02:48:51.806410 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:51.806379 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9l4pp" Apr 18 02:48:52.365247 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.365217 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5hpwv"] Apr 18 02:48:52.368610 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.368586 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.372084 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.372059 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 18 02:48:52.372208 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.372061 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-d27jw\"" Apr 18 02:48:52.372258 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.372072 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 18 02:48:52.377186 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.377165 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5hpwv"] Apr 18 02:48:52.446621 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.446599 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:52.446726 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.446635 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:48:52.446997 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.446976 2577 scope.go:117] "RemoveContainer" containerID="35d4da3345bbca79a9f28865d367a8a93166dbcf1dfd1aef53e81df9abb2a346" Apr 18 02:48:52.447158 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:52.447143 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-sq72w_openshift-console-operator(34bc19f4-315e-4021-baf3-35c6d4a2d5d8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" podUID="34bc19f4-315e-4021-baf3-35c6d4a2d5d8" Apr 18 02:48:52.485115 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.485089 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq"] Apr 18 02:48:52.487935 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.487917 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq" Apr 18 02:48:52.490398 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.490376 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-9vswb\"" Apr 18 02:48:52.490788 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.490771 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 18 02:48:52.497822 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.497799 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq"] Apr 18 02:48:52.555468 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.555443 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/483d06b1-71a6-40dd-b533-c804d4e50b45-crio-socket\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.555602 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.555480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/483d06b1-71a6-40dd-b533-c804d4e50b45-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.555602 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.555509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/483d06b1-71a6-40dd-b533-c804d4e50b45-data-volume\") 
pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.555679 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.555599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/483d06b1-71a6-40dd-b533-c804d4e50b45-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.555731 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.555706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2x8h\" (UniqueName: \"kubernetes.io/projected/483d06b1-71a6-40dd-b533-c804d4e50b45-kube-api-access-w2x8h\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.656505 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.656434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/483d06b1-71a6-40dd-b533-c804d4e50b45-crio-socket\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.656505 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.656472 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/483d06b1-71a6-40dd-b533-c804d4e50b45-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.656505 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:48:52.656500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/483d06b1-71a6-40dd-b533-c804d4e50b45-data-volume\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.656779 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.656569 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/483d06b1-71a6-40dd-b533-c804d4e50b45-crio-socket\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.656779 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.656580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/483d06b1-71a6-40dd-b533-c804d4e50b45-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.656779 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.656672 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2x8h\" (UniqueName: \"kubernetes.io/projected/483d06b1-71a6-40dd-b533-c804d4e50b45-kube-api-access-w2x8h\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.656779 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.656720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c5d30379-95e3-4c98-b645-bda215d9fed8-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-f89tq\" (UID: 
\"c5d30379-95e3-4c98-b645-bda215d9fed8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq" Apr 18 02:48:52.656901 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.656842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/483d06b1-71a6-40dd-b533-c804d4e50b45-data-volume\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.657102 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.657078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/483d06b1-71a6-40dd-b533-c804d4e50b45-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.658710 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.658690 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/483d06b1-71a6-40dd-b533-c804d4e50b45-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.664521 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.664502 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2x8h\" (UniqueName: \"kubernetes.io/projected/483d06b1-71a6-40dd-b533-c804d4e50b45-kube-api-access-w2x8h\") pod \"insights-runtime-extractor-5hpwv\" (UID: \"483d06b1-71a6-40dd-b533-c804d4e50b45\") " pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.677803 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.677784 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5hpwv" Apr 18 02:48:52.758053 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.758007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c5d30379-95e3-4c98-b645-bda215d9fed8-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-f89tq\" (UID: \"c5d30379-95e3-4c98-b645-bda215d9fed8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq" Apr 18 02:48:52.760180 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.760157 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c5d30379-95e3-4c98-b645-bda215d9fed8-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-f89tq\" (UID: \"c5d30379-95e3-4c98-b645-bda215d9fed8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq" Apr 18 02:48:52.792007 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.791984 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5hpwv"] Apr 18 02:48:52.795183 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:48:52.795152 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod483d06b1_71a6_40dd_b533_c804d4e50b45.slice/crio-82e008b564588d8e1f6dfd2d1d4e7205db053b248a6f89aff2c3e972c1c2ab88 WatchSource:0}: Error finding container 82e008b564588d8e1f6dfd2d1d4e7205db053b248a6f89aff2c3e972c1c2ab88: Status 404 returned error can't find the container with id 82e008b564588d8e1f6dfd2d1d4e7205db053b248a6f89aff2c3e972c1c2ab88 Apr 18 02:48:52.796228 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.796212 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq" Apr 18 02:48:52.840780 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.840749 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5hpwv" event={"ID":"483d06b1-71a6-40dd-b533-c804d4e50b45","Type":"ContainerStarted","Data":"82e008b564588d8e1f6dfd2d1d4e7205db053b248a6f89aff2c3e972c1c2ab88"} Apr 18 02:48:52.914514 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:52.914449 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq"] Apr 18 02:48:52.917434 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:48:52.917405 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5d30379_95e3_4c98_b645_bda215d9fed8.slice/crio-dc3e42e0b7b7192b5a121ffc95b8e35bc30e818b30149b7b2dea364eed3c0ed4 WatchSource:0}: Error finding container dc3e42e0b7b7192b5a121ffc95b8e35bc30e818b30149b7b2dea364eed3c0ed4: Status 404 returned error can't find the container with id dc3e42e0b7b7192b5a121ffc95b8e35bc30e818b30149b7b2dea364eed3c0ed4 Apr 18 02:48:53.847182 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:53.847156 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5hpwv" event={"ID":"483d06b1-71a6-40dd-b533-c804d4e50b45","Type":"ContainerStarted","Data":"4dad9e07f11067b08d0a9d4e38b9a58e563af2185f3f9eefc2feccd6174a1d0a"} Apr 18 02:48:53.847510 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:53.847193 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5hpwv" event={"ID":"483d06b1-71a6-40dd-b533-c804d4e50b45","Type":"ContainerStarted","Data":"0f540bb4908a8ece719ae54b2e9aa6541807a603bd985f199a1333fb83f672d4"} Apr 18 02:48:53.848150 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:48:53.848130 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq" event={"ID":"c5d30379-95e3-4c98-b645-bda215d9fed8","Type":"ContainerStarted","Data":"dc3e42e0b7b7192b5a121ffc95b8e35bc30e818b30149b7b2dea364eed3c0ed4"} Apr 18 02:48:54.852055 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:54.852031 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq" event={"ID":"c5d30379-95e3-4c98-b645-bda215d9fed8","Type":"ContainerStarted","Data":"f5bcbc0335046e3248c8dfb0bd80c00cc7c8a4b156a20c21b3e6b39c6936c7c0"} Apr 18 02:48:54.852384 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:54.852187 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq" Apr 18 02:48:54.854130 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:54.854085 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5hpwv" event={"ID":"483d06b1-71a6-40dd-b533-c804d4e50b45","Type":"ContainerStarted","Data":"7d440dbe23f54d4407777a4d654dbbea595a6a0f9253a37e57a59692d4b1ff85"} Apr 18 02:48:54.856824 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:54.856808 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq" Apr 18 02:48:54.866345 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:54.866309 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f89tq" podStartSLOduration=1.943036778 podStartE2EDuration="2.866297212s" podCreationTimestamp="2026-04-18 02:48:52 +0000 UTC" firstStartedPulling="2026-04-18 02:48:52.91939133 +0000 UTC m=+176.059242656" lastFinishedPulling="2026-04-18 02:48:53.842651778 +0000 UTC 
m=+176.982503090" observedRunningTime="2026-04-18 02:48:54.864800285 +0000 UTC m=+178.004651620" watchObservedRunningTime="2026-04-18 02:48:54.866297212 +0000 UTC m=+178.006148525" Apr 18 02:48:54.895693 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:54.895648 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5hpwv" podStartSLOduration=1.002517558 podStartE2EDuration="2.895636752s" podCreationTimestamp="2026-04-18 02:48:52 +0000 UTC" firstStartedPulling="2026-04-18 02:48:52.856654774 +0000 UTC m=+175.996506090" lastFinishedPulling="2026-04-18 02:48:54.749773971 +0000 UTC m=+177.889625284" observedRunningTime="2026-04-18 02:48:54.893734755 +0000 UTC m=+178.033586087" watchObservedRunningTime="2026-04-18 02:48:54.895636752 +0000 UTC m=+178.035488089" Apr 18 02:48:55.113681 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.113597 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-crfwl"] Apr 18 02:48:55.116885 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.116862 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.119325 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.119303 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 18 02:48:55.119432 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.119341 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 18 02:48:55.119432 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.119408 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 18 02:48:55.119544 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.119528 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-hwlmc\"" Apr 18 02:48:55.126057 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.126036 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-crfwl"] Apr 18 02:48:55.280841 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.280805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0ed4e54a-7df6-451d-81f0-1d54c83a76f1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-crfwl\" (UID: \"0ed4e54a-7df6-451d-81f0-1d54c83a76f1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.280841 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.280844 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0ed4e54a-7df6-451d-81f0-1d54c83a76f1-prometheus-operator-kube-rbac-proxy-config\") pod 
\"prometheus-operator-5676c8c784-crfwl\" (UID: \"0ed4e54a-7df6-451d-81f0-1d54c83a76f1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.281049 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.280906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ed4e54a-7df6-451d-81f0-1d54c83a76f1-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-crfwl\" (UID: \"0ed4e54a-7df6-451d-81f0-1d54c83a76f1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.281049 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.280923 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxq7d\" (UniqueName: \"kubernetes.io/projected/0ed4e54a-7df6-451d-81f0-1d54c83a76f1-kube-api-access-pxq7d\") pod \"prometheus-operator-5676c8c784-crfwl\" (UID: \"0ed4e54a-7df6-451d-81f0-1d54c83a76f1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.382192 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.382120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ed4e54a-7df6-451d-81f0-1d54c83a76f1-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-crfwl\" (UID: \"0ed4e54a-7df6-451d-81f0-1d54c83a76f1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.382192 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.382152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxq7d\" (UniqueName: \"kubernetes.io/projected/0ed4e54a-7df6-451d-81f0-1d54c83a76f1-kube-api-access-pxq7d\") pod \"prometheus-operator-5676c8c784-crfwl\" (UID: \"0ed4e54a-7df6-451d-81f0-1d54c83a76f1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.382192 
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.382186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0ed4e54a-7df6-451d-81f0-1d54c83a76f1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-crfwl\" (UID: \"0ed4e54a-7df6-451d-81f0-1d54c83a76f1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.382403 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.382216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0ed4e54a-7df6-451d-81f0-1d54c83a76f1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-crfwl\" (UID: \"0ed4e54a-7df6-451d-81f0-1d54c83a76f1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.382746 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.382726 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ed4e54a-7df6-451d-81f0-1d54c83a76f1-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-crfwl\" (UID: \"0ed4e54a-7df6-451d-81f0-1d54c83a76f1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.384602 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.384579 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0ed4e54a-7df6-451d-81f0-1d54c83a76f1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-crfwl\" (UID: \"0ed4e54a-7df6-451d-81f0-1d54c83a76f1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.384698 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.384606 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/0ed4e54a-7df6-451d-81f0-1d54c83a76f1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-crfwl\" (UID: \"0ed4e54a-7df6-451d-81f0-1d54c83a76f1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.390124 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.390105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxq7d\" (UniqueName: \"kubernetes.io/projected/0ed4e54a-7df6-451d-81f0-1d54c83a76f1-kube-api-access-pxq7d\") pod \"prometheus-operator-5676c8c784-crfwl\" (UID: \"0ed4e54a-7df6-451d-81f0-1d54c83a76f1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.425930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.425907 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" Apr 18 02:48:55.538968 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.538943 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-crfwl"] Apr 18 02:48:55.541419 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:48:55.541395 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed4e54a_7df6_451d_81f0_1d54c83a76f1.slice/crio-6d0d2fe9249e58638df9f93b4f4577256af0873404f85f0571c398d38a24af60 WatchSource:0}: Error finding container 6d0d2fe9249e58638df9f93b4f4577256af0873404f85f0571c398d38a24af60: Status 404 returned error can't find the container with id 6d0d2fe9249e58638df9f93b4f4577256af0873404f85f0571c398d38a24af60 Apr 18 02:48:55.857480 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:55.857443 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" 
event={"ID":"0ed4e54a-7df6-451d-81f0-1d54c83a76f1","Type":"ContainerStarted","Data":"6d0d2fe9249e58638df9f93b4f4577256af0873404f85f0571c398d38a24af60"} Apr 18 02:48:56.863435 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:56.863404 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" event={"ID":"0ed4e54a-7df6-451d-81f0-1d54c83a76f1","Type":"ContainerStarted","Data":"9da679b35c5fe11408efc35d95134d9ea0e82c0cdd17949d3b2f690cb5ac4125"} Apr 18 02:48:56.863856 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:56.863444 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" event={"ID":"0ed4e54a-7df6-451d-81f0-1d54c83a76f1","Type":"ContainerStarted","Data":"6fba27c3fb25568dbe81e18e73f7fbb23e60c70ca5509731976f4cb70fcc64c5"} Apr 18 02:48:56.879486 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:56.879440 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-crfwl" podStartSLOduration=0.697458281 podStartE2EDuration="1.879425414s" podCreationTimestamp="2026-04-18 02:48:55 +0000 UTC" firstStartedPulling="2026-04-18 02:48:55.543258701 +0000 UTC m=+178.683110014" lastFinishedPulling="2026-04-18 02:48:56.725225835 +0000 UTC m=+179.865077147" observedRunningTime="2026-04-18 02:48:56.877636386 +0000 UTC m=+180.017487723" watchObservedRunningTime="2026-04-18 02:48:56.879425414 +0000 UTC m=+180.019276787" Apr 18 02:48:57.989448 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:57.989413 2577 patch_prober.go:28] interesting pod/image-registry-7f9c87579f-2s7d5 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 18 02:48:57.989820 ip-10-0-128-79 kubenswrapper[2577]: 
I0418 02:48:57.989477 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" podUID="f5f88955-bfd0-4343-b6c6-2a18dd0d1149" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 18 02:48:58.470313 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.470242 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cxw6w"] Apr 18 02:48:58.473973 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.473946 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.476807 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.476781 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 18 02:48:58.477114 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.477097 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pzdgw\"" Apr 18 02:48:58.477302 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.477285 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 18 02:48:58.477704 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.477685 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 18 02:48:58.504243 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.504213 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-tls\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.504351 
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.504249 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-metrics-client-ca\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.504351 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.504324 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-wtmp\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.504457 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.504358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.504457 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.504407 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-accelerators-collector-config\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.504457 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.504448 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-root\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.504584 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.504479 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-textfile\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.504584 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.504513 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-sys\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.504584 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.504531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzs6g\" (UniqueName: \"kubernetes.io/projected/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-kube-api-access-dzs6g\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.605578 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.605527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-tls\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.605753 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.605591 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-metrics-client-ca\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.605753 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:58.605673 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 18 02:48:58.605753 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:48:58.605746 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-tls podName:9dc12d5a-da8e-40a0-8c4f-d989f722e15e nodeName:}" failed. No retries permitted until 2026-04-18 02:48:59.105724205 +0000 UTC m=+182.245575525 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-tls") pod "node-exporter-cxw6w" (UID: "9dc12d5a-da8e-40a0-8c4f-d989f722e15e") : secret "node-exporter-tls" not found Apr 18 02:48:58.605920 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.605790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-wtmp\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.605920 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.605831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " 
pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.605920 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.605878 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-accelerators-collector-config\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.606065 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.605929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-root\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.606065 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.605965 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-textfile\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.606065 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.606015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-sys\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.606065 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.606050 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzs6g\" (UniqueName: \"kubernetes.io/projected/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-kube-api-access-dzs6g\") pod \"node-exporter-cxw6w\" 
(UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.606245 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.606209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-metrics-client-ca\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.606293 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.606280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-sys\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.606337 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.606304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-textfile\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.606337 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.606318 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-root\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.606419 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.606396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-wtmp\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " 
pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.606640 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.606616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-accelerators-collector-config\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.608185 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.608165 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:58.616906 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:58.616884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzs6g\" (UniqueName: \"kubernetes.io/projected/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-kube-api-access-dzs6g\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:59.110057 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:59.110017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-tls\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:59.112351 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:59.112325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/9dc12d5a-da8e-40a0-8c4f-d989f722e15e-node-exporter-tls\") pod \"node-exporter-cxw6w\" (UID: \"9dc12d5a-da8e-40a0-8c4f-d989f722e15e\") " pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:59.386961 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:59.386881 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cxw6w" Apr 18 02:48:59.396562 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:48:59.396518 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dc12d5a_da8e_40a0_8c4f_d989f722e15e.slice/crio-59dc813fb98bc02477067b64170796fe8ae266f99fcc37fc35e9733785300398 WatchSource:0}: Error finding container 59dc813fb98bc02477067b64170796fe8ae266f99fcc37fc35e9733785300398: Status 404 returned error can't find the container with id 59dc813fb98bc02477067b64170796fe8ae266f99fcc37fc35e9733785300398 Apr 18 02:48:59.799498 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:59.799466 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:48:59.872499 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:48:59.872464 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxw6w" event={"ID":"9dc12d5a-da8e-40a0-8c4f-d989f722e15e","Type":"ContainerStarted","Data":"59dc813fb98bc02477067b64170796fe8ae266f99fcc37fc35e9733785300398"} Apr 18 02:49:00.876862 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:00.876826 2577 generic.go:358] "Generic (PLEG): container finished" podID="9dc12d5a-da8e-40a0-8c4f-d989f722e15e" containerID="8507bb8065458e30a999a9441c82b5099bec8e46739a1b0eadba3636301a7d72" exitCode=0 Apr 18 02:49:00.877337 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:00.876886 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxw6w" 
event={"ID":"9dc12d5a-da8e-40a0-8c4f-d989f722e15e","Type":"ContainerDied","Data":"8507bb8065458e30a999a9441c82b5099bec8e46739a1b0eadba3636301a7d72"} Apr 18 02:49:01.882238 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:01.882201 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxw6w" event={"ID":"9dc12d5a-da8e-40a0-8c4f-d989f722e15e","Type":"ContainerStarted","Data":"7724b7c45aea60a5d96e7e6e00e6ab8ab0f85be86ae6f3d149ad59fb7a9b69d3"} Apr 18 02:49:01.882238 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:01.882241 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxw6w" event={"ID":"9dc12d5a-da8e-40a0-8c4f-d989f722e15e","Type":"ContainerStarted","Data":"6433a89b348a9390712a8526108920b82e2ff6cc2846d4b15049b034af783132"} Apr 18 02:49:01.900022 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:01.899970 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cxw6w" podStartSLOduration=3.070238392 podStartE2EDuration="3.899955412s" podCreationTimestamp="2026-04-18 02:48:58 +0000 UTC" firstStartedPulling="2026-04-18 02:48:59.398882739 +0000 UTC m=+182.538734066" lastFinishedPulling="2026-04-18 02:49:00.22859976 +0000 UTC m=+183.368451086" observedRunningTime="2026-04-18 02:49:01.898800539 +0000 UTC m=+185.038651889" watchObservedRunningTime="2026-04-18 02:49:01.899955412 +0000 UTC m=+185.039806746" Apr 18 02:49:02.766977 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.766946 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-757669dd5c-9qwfl"] Apr 18 02:49:02.771757 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.771738 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.774341 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.774315 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-5a4u2meefmobc\"" Apr 18 02:49:02.775449 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.775423 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 18 02:49:02.775588 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.775494 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 18 02:49:02.775690 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.775440 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 18 02:49:02.775799 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.775783 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-rw59l\"" Apr 18 02:49:02.775950 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.775736 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 18 02:49:02.779227 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.779205 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-757669dd5c-9qwfl"] Apr 18 02:49:02.839398 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.839371 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-secret-metrics-server-tls\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " 
pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.839536 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.839413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-client-ca-bundle\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.839536 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.839436 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-audit-log\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.839694 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.839528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.839694 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.839611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmxv7\" (UniqueName: \"kubernetes.io/projected/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-kube-api-access-dmxv7\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.839694 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.839648 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-metrics-server-audit-profiles\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.839694 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.839682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-secret-metrics-server-client-certs\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.940977 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.940939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.941446 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.940989 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmxv7\" (UniqueName: \"kubernetes.io/projected/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-kube-api-access-dmxv7\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.941446 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.941029 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-metrics-server-audit-profiles\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.941446 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.941061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-secret-metrics-server-client-certs\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.941446 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.941139 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-secret-metrics-server-tls\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.941446 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.941185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-client-ca-bundle\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.941446 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.941236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-audit-log\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 
02:49:02.941844 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.941788 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.941844 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.941824 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-audit-log\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.942251 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.942235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-metrics-server-audit-profiles\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.943524 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.943503 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-secret-metrics-server-client-certs\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.944059 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.944041 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-secret-metrics-server-tls\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.944139 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.944045 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-client-ca-bundle\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:02.950058 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:02.950039 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmxv7\" (UniqueName: \"kubernetes.io/projected/6ceb01e6-2764-48f1-8ea7-cfec3b7e935b-kube-api-access-dmxv7\") pod \"metrics-server-757669dd5c-9qwfl\" (UID: \"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b\") " pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:03.083719 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:03.083632 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:03.212922 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:03.212891 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-757669dd5c-9qwfl"] Apr 18 02:49:03.215960 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:49:03.215927 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ceb01e6_2764_48f1_8ea7_cfec3b7e935b.slice/crio-5d1f3681602bae328e709165927dd553e78964d1abb8021307ca1644804e3752 WatchSource:0}: Error finding container 5d1f3681602bae328e709165927dd553e78964d1abb8021307ca1644804e3752: Status 404 returned error can't find the container with id 5d1f3681602bae328e709165927dd553e78964d1abb8021307ca1644804e3752 Apr 18 02:49:03.888177 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:03.888133 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" event={"ID":"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b","Type":"ContainerStarted","Data":"5d1f3681602bae328e709165927dd553e78964d1abb8021307ca1644804e3752"} Apr 18 02:49:04.893223 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:04.893129 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" event={"ID":"6ceb01e6-2764-48f1-8ea7-cfec3b7e935b","Type":"ContainerStarted","Data":"b649ea3aa3a15ba65233a3bc6e05472d90fa1e132c98cd11c173d6086e3a7bae"} Apr 18 02:49:04.909301 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:04.909251 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" podStartSLOduration=1.624904172 podStartE2EDuration="2.909235412s" podCreationTimestamp="2026-04-18 02:49:02 +0000 UTC" firstStartedPulling="2026-04-18 02:49:03.218106471 +0000 UTC m=+186.357957784" lastFinishedPulling="2026-04-18 02:49:04.502437697 +0000 
UTC m=+187.642289024" observedRunningTime="2026-04-18 02:49:04.908786564 +0000 UTC m=+188.048637907" watchObservedRunningTime="2026-04-18 02:49:04.909235412 +0000 UTC m=+188.049086746" Apr 18 02:49:06.358120 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:06.358085 2577 scope.go:117] "RemoveContainer" containerID="35d4da3345bbca79a9f28865d367a8a93166dbcf1dfd1aef53e81df9abb2a346" Apr 18 02:49:06.900680 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:06.900655 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 02:49:06.900848 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:06.900768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" event={"ID":"34bc19f4-315e-4021-baf3-35c6d4a2d5d8","Type":"ContainerStarted","Data":"6715a2149f0b096d9be02a5f26ea3ccb92125a9a0cf5eb71769e9f95a797a0c9"} Apr 18 02:49:06.901076 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:06.901044 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:49:06.916718 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:06.916673 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" podStartSLOduration=43.158053066 podStartE2EDuration="44.916661062s" podCreationTimestamp="2026-04-18 02:48:22 +0000 UTC" firstStartedPulling="2026-04-18 02:48:22.836917648 +0000 UTC m=+145.976768960" lastFinishedPulling="2026-04-18 02:48:24.595525639 +0000 UTC m=+147.735376956" observedRunningTime="2026-04-18 02:49:06.915437915 +0000 UTC m=+190.055289245" watchObservedRunningTime="2026-04-18 02:49:06.916661062 +0000 UTC m=+190.056512425" Apr 18 02:49:07.043776 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:07.043747 2577 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-sq72w" Apr 18 02:49:14.702087 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:14.702055 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7f9c87579f-2s7d5"] Apr 18 02:49:16.749996 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.749964 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f5984ff74-hldrv"] Apr 18 02:49:16.753302 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.753280 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.756910 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.756885 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 18 02:49:16.757020 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.756919 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 18 02:49:16.758008 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.757993 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wwf9c\"" Apr 18 02:49:16.758074 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.757993 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 18 02:49:16.758074 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.758025 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 18 02:49:16.762112 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.762094 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 18 02:49:16.762464 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:49:16.762448 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 18 02:49:16.762508 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.762461 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 18 02:49:16.772855 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.772828 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f5984ff74-hldrv"] Apr 18 02:49:16.866631 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.866599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0859c2d2-74e0-4467-b975-f8199ef104f6-console-oauth-config\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.866631 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.866634 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-console-config\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.866835 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.866673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-oauth-serving-cert\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.866835 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.866699 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0859c2d2-74e0-4467-b975-f8199ef104f6-console-serving-cert\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.866835 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.866745 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-service-ca\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.866835 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.866802 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx6tm\" (UniqueName: \"kubernetes.io/projected/0859c2d2-74e0-4467-b975-f8199ef104f6-kube-api-access-dx6tm\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.968035 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.967996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0859c2d2-74e0-4467-b975-f8199ef104f6-console-oauth-config\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.968035 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.968034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-console-config\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " 
pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.968247 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.968065 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-oauth-serving-cert\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.968247 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.968086 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0859c2d2-74e0-4467-b975-f8199ef104f6-console-serving-cert\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.968247 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.968115 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-service-ca\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.968247 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.968144 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dx6tm\" (UniqueName: \"kubernetes.io/projected/0859c2d2-74e0-4467-b975-f8199ef104f6-kube-api-access-dx6tm\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.968806 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.968778 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-oauth-serving-cert\") pod 
\"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.968926 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.968826 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-console-config\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.968926 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.968868 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-service-ca\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.970521 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.970493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0859c2d2-74e0-4467-b975-f8199ef104f6-console-oauth-config\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.970638 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.970494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0859c2d2-74e0-4467-b975-f8199ef104f6-console-serving-cert\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:16.975138 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:16.975116 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx6tm\" (UniqueName: 
\"kubernetes.io/projected/0859c2d2-74e0-4467-b975-f8199ef104f6-kube-api-access-dx6tm\") pod \"console-7f5984ff74-hldrv\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:17.062338 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:17.062248 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:17.180710 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:17.180599 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f5984ff74-hldrv"] Apr 18 02:49:17.183359 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:49:17.183327 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0859c2d2_74e0_4467_b975_f8199ef104f6.slice/crio-04324985a36d7b71a75ce4ac8fac56a1c0713af1ff927d823d42564f9856c18d WatchSource:0}: Error finding container 04324985a36d7b71a75ce4ac8fac56a1c0713af1ff927d823d42564f9856c18d: Status 404 returned error can't find the container with id 04324985a36d7b71a75ce4ac8fac56a1c0713af1ff927d823d42564f9856c18d Apr 18 02:49:17.929886 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:17.929847 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5984ff74-hldrv" event={"ID":"0859c2d2-74e0-4467-b975-f8199ef104f6","Type":"ContainerStarted","Data":"04324985a36d7b71a75ce4ac8fac56a1c0713af1ff927d823d42564f9856c18d"} Apr 18 02:49:20.942062 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:20.942023 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5984ff74-hldrv" event={"ID":"0859c2d2-74e0-4467-b975-f8199ef104f6","Type":"ContainerStarted","Data":"4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c"} Apr 18 02:49:20.959188 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:20.959135 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-7f5984ff74-hldrv" podStartSLOduration=2.253275047 podStartE2EDuration="4.959120246s" podCreationTimestamp="2026-04-18 02:49:16 +0000 UTC" firstStartedPulling="2026-04-18 02:49:17.185403433 +0000 UTC m=+200.325254746" lastFinishedPulling="2026-04-18 02:49:19.891248631 +0000 UTC m=+203.031099945" observedRunningTime="2026-04-18 02:49:20.956999033 +0000 UTC m=+204.096850369" watchObservedRunningTime="2026-04-18 02:49:20.959120246 +0000 UTC m=+204.098971581" Apr 18 02:49:23.083987 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:23.083946 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:23.084385 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:23.084018 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:27.063271 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.063230 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:27.063271 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.063281 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:27.072299 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.072275 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:27.323543 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.323467 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d8bcb6fdc-rj7km"] Apr 18 02:49:27.329183 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.329163 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.333345 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.333319 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d8bcb6fdc-rj7km"] Apr 18 02:49:27.337224 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.337201 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 18 02:49:27.460574 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.460509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-config\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.460746 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.460581 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n94vd\" (UniqueName: \"kubernetes.io/projected/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-kube-api-access-n94vd\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.460746 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.460623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-serving-cert\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.460746 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.460686 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-trusted-ca-bundle\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.460746 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.460728 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-oauth-serving-cert\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.460876 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.460792 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-oauth-config\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.460876 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.460816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-service-ca\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.561633 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.561584 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-oauth-config\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.561820 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.561706 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-service-ca\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.561820 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.561734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-config\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.561820 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.561753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n94vd\" (UniqueName: \"kubernetes.io/projected/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-kube-api-access-n94vd\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.561820 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.561780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-serving-cert\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.561820 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.561802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-trusted-ca-bundle\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.561820 
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.561821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-oauth-serving-cert\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.562472 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.562446 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-oauth-serving-cert\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.562730 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.562707 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-service-ca\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.563145 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.563124 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-config\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.563711 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.563690 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-trusted-ca-bundle\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 
18 02:49:27.563928 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.563912 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-oauth-config\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.564573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.564537 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-serving-cert\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.569568 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.569522 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n94vd\" (UniqueName: \"kubernetes.io/projected/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-kube-api-access-n94vd\") pod \"console-5d8bcb6fdc-rj7km\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.639085 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.639000 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:27.756856 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.756822 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d8bcb6fdc-rj7km"] Apr 18 02:49:27.761531 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:49:27.761505 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f3ee37_9c6a_4b8c_871a_7ce246fc7c87.slice/crio-d40aa46b4d76772a3dfea8bc0f27721b81e8c8f7ca6f0866e9836f09a090d8c6 WatchSource:0}: Error finding container d40aa46b4d76772a3dfea8bc0f27721b81e8c8f7ca6f0866e9836f09a090d8c6: Status 404 returned error can't find the container with id d40aa46b4d76772a3dfea8bc0f27721b81e8c8f7ca6f0866e9836f09a090d8c6 Apr 18 02:49:27.963279 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.963182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d8bcb6fdc-rj7km" event={"ID":"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87","Type":"ContainerStarted","Data":"7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57"} Apr 18 02:49:27.963279 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.963220 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d8bcb6fdc-rj7km" event={"ID":"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87","Type":"ContainerStarted","Data":"d40aa46b4d76772a3dfea8bc0f27721b81e8c8f7ca6f0866e9836f09a090d8c6"} Apr 18 02:49:27.967099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.967078 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:49:27.979247 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:27.979205 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d8bcb6fdc-rj7km" podStartSLOduration=0.979192612 podStartE2EDuration="979.192612ms" podCreationTimestamp="2026-04-18 
02:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:49:27.978470022 +0000 UTC m=+211.118321357" watchObservedRunningTime="2026-04-18 02:49:27.979192612 +0000 UTC m=+211.119043946" Apr 18 02:49:36.988654 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:36.988618 2577 generic.go:358] "Generic (PLEG): container finished" podID="7920e7d2-3418-454b-9269-8f13a0c96d2d" containerID="d39d21178798cafc7631f51f291fb3c6db044c4c5ddfcffb68ef4d82682d654e" exitCode=0 Apr 18 02:49:36.989045 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:36.988690 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-25mgk" event={"ID":"7920e7d2-3418-454b-9269-8f13a0c96d2d","Type":"ContainerDied","Data":"d39d21178798cafc7631f51f291fb3c6db044c4c5ddfcffb68ef4d82682d654e"} Apr 18 02:49:36.989092 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:36.989061 2577 scope.go:117] "RemoveContainer" containerID="d39d21178798cafc7631f51f291fb3c6db044c4c5ddfcffb68ef4d82682d654e" Apr 18 02:49:37.639494 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:37.639454 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:37.639494 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:37.639500 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:37.644390 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:37.644368 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:37.995275 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:37.995228 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-25mgk" 
event={"ID":"7920e7d2-3418-454b-9269-8f13a0c96d2d","Type":"ContainerStarted","Data":"f8f9cababbd72edee6763bdd6827d49facc89a0f361d51b128f2967f2f710495"} Apr 18 02:49:38.002832 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:38.002811 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:49:38.058216 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:38.058190 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f5984ff74-hldrv"] Apr 18 02:49:39.720633 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:39.720592 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" podUID="f5f88955-bfd0-4343-b6c6-2a18dd0d1149" containerName="registry" containerID="cri-o://9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2" gracePeriod=30 Apr 18 02:49:39.796306 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:39.796272 2577 patch_prober.go:28] interesting pod/image-registry-7f9c87579f-2s7d5 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.133.0.6:5000/healthz\": dial tcp 10.133.0.6:5000: connect: connection refused" start-of-body= Apr 18 02:49:39.796462 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:39.796331 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" podUID="f5f88955-bfd0-4343-b6c6-2a18dd0d1149" containerName="registry" probeResult="failure" output="Get \"https://10.133.0.6:5000/healthz\": dial tcp 10.133.0.6:5000: connect: connection refused" Apr 18 02:49:40.960444 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:40.960414 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:49:41.005562 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.005522 2577 generic.go:358] "Generic (PLEG): container finished" podID="f5f88955-bfd0-4343-b6c6-2a18dd0d1149" containerID="9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2" exitCode=0 Apr 18 02:49:41.005766 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.005603 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" Apr 18 02:49:41.005766 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.005613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" event={"ID":"f5f88955-bfd0-4343-b6c6-2a18dd0d1149","Type":"ContainerDied","Data":"9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2"} Apr 18 02:49:41.005766 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.005654 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7f9c87579f-2s7d5" event={"ID":"f5f88955-bfd0-4343-b6c6-2a18dd0d1149","Type":"ContainerDied","Data":"6b049e3894252bb2990c51d5021e309206af36f12c6bdc0a80b76af2558160ce"} Apr 18 02:49:41.005766 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.005671 2577 scope.go:117] "RemoveContainer" containerID="9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2" Apr 18 02:49:41.013330 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.013311 2577 scope.go:117] "RemoveContainer" containerID="9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2" Apr 18 02:49:41.013619 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:49:41.013597 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2\": container with ID starting with 
9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2 not found: ID does not exist" containerID="9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2" Apr 18 02:49:41.013666 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.013630 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2"} err="failed to get container status \"9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2\": rpc error: code = NotFound desc = could not find container \"9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2\": container with ID starting with 9a61a0327692e040e1c3174a2969a18c0b56b35f6b012f348e96d940163aa8f2 not found: ID does not exist" Apr 18 02:49:41.084799 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.084713 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-ca-trust-extracted\") pod \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " Apr 18 02:49:41.084799 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.084756 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") pod \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " Apr 18 02:49:41.084799 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.084775 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-image-registry-private-configuration\") pod \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " Apr 18 02:49:41.085039 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:49:41.084816 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-trusted-ca\") pod \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " Apr 18 02:49:41.085039 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.084863 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-certificates\") pod \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " Apr 18 02:49:41.085039 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.084886 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj9h4\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-kube-api-access-nj9h4\") pod \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " Apr 18 02:49:41.085039 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.084927 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-installation-pull-secrets\") pod \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " Apr 18 02:49:41.085039 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.084982 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-bound-sa-token\") pod \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\" (UID: \"f5f88955-bfd0-4343-b6c6-2a18dd0d1149\") " Apr 18 02:49:41.085408 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.085345 2577 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f5f88955-bfd0-4343-b6c6-2a18dd0d1149" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:49:41.085572 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.085415 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f5f88955-bfd0-4343-b6c6-2a18dd0d1149" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:49:41.087604 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.087573 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f5f88955-bfd0-4343-b6c6-2a18dd0d1149" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:49:41.087814 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.087786 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f5f88955-bfd0-4343-b6c6-2a18dd0d1149" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:49:41.087814 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.087803 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f5f88955-bfd0-4343-b6c6-2a18dd0d1149" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:49:41.087987 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.087785 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f5f88955-bfd0-4343-b6c6-2a18dd0d1149" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:49:41.087987 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.087867 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-kube-api-access-nj9h4" (OuterVolumeSpecName: "kube-api-access-nj9h4") pod "f5f88955-bfd0-4343-b6c6-2a18dd0d1149" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149"). InnerVolumeSpecName "kube-api-access-nj9h4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:49:41.094142 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.094113 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f5f88955-bfd0-4343-b6c6-2a18dd0d1149" (UID: "f5f88955-bfd0-4343-b6c6-2a18dd0d1149"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:49:41.185805 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.185768 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-trusted-ca\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:49:41.185805 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.185801 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-certificates\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:49:41.185805 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.185812 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nj9h4\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-kube-api-access-nj9h4\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:49:41.186001 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.185822 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-installation-pull-secrets\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:49:41.186001 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.185831 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-bound-sa-token\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:49:41.186001 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.185839 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-ca-trust-extracted\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:49:41.186001 
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.185848 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-registry-tls\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:49:41.186001 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.185858 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5f88955-bfd0-4343-b6c6-2a18dd0d1149-image-registry-private-configuration\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:49:41.325741 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.325709 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7f9c87579f-2s7d5"] Apr 18 02:49:41.331045 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.331018 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7f9c87579f-2s7d5"] Apr 18 02:49:41.362596 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:41.362508 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5f88955-bfd0-4343-b6c6-2a18dd0d1149" path="/var/lib/kubelet/pods/f5f88955-bfd0-4343-b6c6-2a18dd0d1149/volumes" Apr 18 02:49:43.089305 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:43.089278 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:49:43.093194 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:49:43.093170 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-757669dd5c-9qwfl" Apr 18 02:50:03.081923 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.081857 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7f5984ff74-hldrv" podUID="0859c2d2-74e0-4467-b975-f8199ef104f6" 
containerName="console" containerID="cri-o://4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c" gracePeriod=15 Apr 18 02:50:03.324358 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.324337 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f5984ff74-hldrv_0859c2d2-74e0-4467-b975-f8199ef104f6/console/0.log" Apr 18 02:50:03.324464 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.324396 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:50:03.369958 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.369894 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-service-ca\") pod \"0859c2d2-74e0-4467-b975-f8199ef104f6\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " Apr 18 02:50:03.369958 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.369930 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0859c2d2-74e0-4467-b975-f8199ef104f6-console-oauth-config\") pod \"0859c2d2-74e0-4467-b975-f8199ef104f6\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " Apr 18 02:50:03.369958 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.369957 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-console-config\") pod \"0859c2d2-74e0-4467-b975-f8199ef104f6\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " Apr 18 02:50:03.370150 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.369975 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx6tm\" (UniqueName: \"kubernetes.io/projected/0859c2d2-74e0-4467-b975-f8199ef104f6-kube-api-access-dx6tm\") pod 
\"0859c2d2-74e0-4467-b975-f8199ef104f6\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " Apr 18 02:50:03.370150 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.370012 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-oauth-serving-cert\") pod \"0859c2d2-74e0-4467-b975-f8199ef104f6\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " Apr 18 02:50:03.370150 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.370042 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0859c2d2-74e0-4467-b975-f8199ef104f6-console-serving-cert\") pod \"0859c2d2-74e0-4467-b975-f8199ef104f6\" (UID: \"0859c2d2-74e0-4467-b975-f8199ef104f6\") " Apr 18 02:50:03.370356 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.370322 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-service-ca" (OuterVolumeSpecName: "service-ca") pod "0859c2d2-74e0-4467-b975-f8199ef104f6" (UID: "0859c2d2-74e0-4467-b975-f8199ef104f6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:50:03.370459 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.370371 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-console-config" (OuterVolumeSpecName: "console-config") pod "0859c2d2-74e0-4467-b975-f8199ef104f6" (UID: "0859c2d2-74e0-4467-b975-f8199ef104f6"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:50:03.370459 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.370382 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0859c2d2-74e0-4467-b975-f8199ef104f6" (UID: "0859c2d2-74e0-4467-b975-f8199ef104f6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:50:03.372185 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.372156 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0859c2d2-74e0-4467-b975-f8199ef104f6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0859c2d2-74e0-4467-b975-f8199ef104f6" (UID: "0859c2d2-74e0-4467-b975-f8199ef104f6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:50:03.372269 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.372202 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0859c2d2-74e0-4467-b975-f8199ef104f6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0859c2d2-74e0-4467-b975-f8199ef104f6" (UID: "0859c2d2-74e0-4467-b975-f8199ef104f6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:50:03.372269 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.372213 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0859c2d2-74e0-4467-b975-f8199ef104f6-kube-api-access-dx6tm" (OuterVolumeSpecName: "kube-api-access-dx6tm") pod "0859c2d2-74e0-4467-b975-f8199ef104f6" (UID: "0859c2d2-74e0-4467-b975-f8199ef104f6"). InnerVolumeSpecName "kube-api-access-dx6tm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:50:03.471353 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.471325 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-service-ca\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:50:03.471353 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.471351 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0859c2d2-74e0-4467-b975-f8199ef104f6-console-oauth-config\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:50:03.471503 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.471364 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-console-config\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:50:03.471503 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.471377 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dx6tm\" (UniqueName: \"kubernetes.io/projected/0859c2d2-74e0-4467-b975-f8199ef104f6-kube-api-access-dx6tm\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:50:03.471503 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.471390 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0859c2d2-74e0-4467-b975-f8199ef104f6-oauth-serving-cert\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:50:03.471503 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:03.471403 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0859c2d2-74e0-4467-b975-f8199ef104f6-console-serving-cert\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:50:04.070212 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:50:04.070187 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f5984ff74-hldrv_0859c2d2-74e0-4467-b975-f8199ef104f6/console/0.log" Apr 18 02:50:04.070404 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:04.070223 2577 generic.go:358] "Generic (PLEG): container finished" podID="0859c2d2-74e0-4467-b975-f8199ef104f6" containerID="4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c" exitCode=2 Apr 18 02:50:04.070404 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:04.070254 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5984ff74-hldrv" event={"ID":"0859c2d2-74e0-4467-b975-f8199ef104f6","Type":"ContainerDied","Data":"4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c"} Apr 18 02:50:04.070404 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:04.070297 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5984ff74-hldrv" event={"ID":"0859c2d2-74e0-4467-b975-f8199ef104f6","Type":"ContainerDied","Data":"04324985a36d7b71a75ce4ac8fac56a1c0713af1ff927d823d42564f9856c18d"} Apr 18 02:50:04.070404 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:04.070307 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f5984ff74-hldrv" Apr 18 02:50:04.070404 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:04.070317 2577 scope.go:117] "RemoveContainer" containerID="4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c" Apr 18 02:50:04.078129 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:04.078112 2577 scope.go:117] "RemoveContainer" containerID="4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c" Apr 18 02:50:04.078379 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:50:04.078358 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c\": container with ID starting with 4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c not found: ID does not exist" containerID="4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c" Apr 18 02:50:04.078440 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:04.078390 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c"} err="failed to get container status \"4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c\": rpc error: code = NotFound desc = could not find container \"4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c\": container with ID starting with 4c0b5f6c69fe653b2fc97ab6e5b1890d1f503d406273d16c6a86e0ccf7322d9c not found: ID does not exist" Apr 18 02:50:04.092095 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:04.092074 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f5984ff74-hldrv"] Apr 18 02:50:04.095522 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:04.095501 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7f5984ff74-hldrv"] Apr 18 02:50:05.362458 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:50:05.362426 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0859c2d2-74e0-4467-b975-f8199ef104f6" path="/var/lib/kubelet/pods/0859c2d2-74e0-4467-b975-f8199ef104f6/volumes" Apr 18 02:50:09.217782 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:09.217745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:50:09.220026 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:09.219996 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64806518-b360-4104-92e5-8a3017ab382a-metrics-certs\") pod \"network-metrics-daemon-6xc88\" (UID: \"64806518-b360-4104-92e5-8a3017ab382a\") " pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:50:09.261625 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:09.261598 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h2hff\"" Apr 18 02:50:09.269447 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:09.269433 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6xc88" Apr 18 02:50:09.590030 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:09.590005 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6xc88"] Apr 18 02:50:09.594648 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:50:09.592811 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64806518_b360_4104_92e5_8a3017ab382a.slice/crio-a6d0c4da70e96beeff261da3a185371272d4cbe2314ae77b9a37d9bda5d096ac WatchSource:0}: Error finding container a6d0c4da70e96beeff261da3a185371272d4cbe2314ae77b9a37d9bda5d096ac: Status 404 returned error can't find the container with id a6d0c4da70e96beeff261da3a185371272d4cbe2314ae77b9a37d9bda5d096ac Apr 18 02:50:10.089003 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:10.088955 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6xc88" event={"ID":"64806518-b360-4104-92e5-8a3017ab382a","Type":"ContainerStarted","Data":"a6d0c4da70e96beeff261da3a185371272d4cbe2314ae77b9a37d9bda5d096ac"} Apr 18 02:50:11.093110 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:11.093074 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6xc88" event={"ID":"64806518-b360-4104-92e5-8a3017ab382a","Type":"ContainerStarted","Data":"6d03e5d94ade536288e837f0eb248f960552e3affed83d224224f8b250fa3b9c"} Apr 18 02:50:11.093110 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:11.093112 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6xc88" event={"ID":"64806518-b360-4104-92e5-8a3017ab382a","Type":"ContainerStarted","Data":"1e2cfb830a7641f4f19758b6226336e653e83b892f78a6a1db220827b8f7d4f2"} Apr 18 02:50:11.111917 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:11.111846 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-6xc88" podStartSLOduration=253.171181834 podStartE2EDuration="4m14.111833557s" podCreationTimestamp="2026-04-18 02:45:57 +0000 UTC" firstStartedPulling="2026-04-18 02:50:09.596126489 +0000 UTC m=+252.735977802" lastFinishedPulling="2026-04-18 02:50:10.536778207 +0000 UTC m=+253.676629525" observedRunningTime="2026-04-18 02:50:11.111123044 +0000 UTC m=+254.250974380" watchObservedRunningTime="2026-04-18 02:50:11.111833557 +0000 UTC m=+254.251684892" Apr 18 02:50:26.295512 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.295435 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56586b889b-kxhpx"] Apr 18 02:50:26.295887 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.295816 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5f88955-bfd0-4343-b6c6-2a18dd0d1149" containerName="registry" Apr 18 02:50:26.295887 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.295832 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f88955-bfd0-4343-b6c6-2a18dd0d1149" containerName="registry" Apr 18 02:50:26.295887 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.295844 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0859c2d2-74e0-4467-b975-f8199ef104f6" containerName="console" Apr 18 02:50:26.295887 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.295849 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0859c2d2-74e0-4467-b975-f8199ef104f6" containerName="console" Apr 18 02:50:26.296023 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.295911 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5f88955-bfd0-4343-b6c6-2a18dd0d1149" containerName="registry" Apr 18 02:50:26.296023 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.295922 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0859c2d2-74e0-4467-b975-f8199ef104f6" containerName="console" Apr 18 02:50:26.301185 
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.301164 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.311741 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.311719 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56586b889b-kxhpx"] Apr 18 02:50:26.458158 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.458125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-service-ca\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.458353 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.458213 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmfjs\" (UniqueName: \"kubernetes.io/projected/eafde854-6e62-4738-b2cf-f38ffb6dbf47-kube-api-access-rmfjs\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.458353 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.458275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-config\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.458478 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.458405 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-oauth-config\") pod \"console-56586b889b-kxhpx\" (UID: 
\"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.458478 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.458437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-oauth-serving-cert\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.458617 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.458479 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-trusted-ca-bundle\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.458617 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.458522 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-serving-cert\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.559252 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.559174 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-service-ca\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.559252 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.559227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmfjs\" (UniqueName: 
\"kubernetes.io/projected/eafde854-6e62-4738-b2cf-f38ffb6dbf47-kube-api-access-rmfjs\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.559252 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.559246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-config\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.559609 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.559263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-oauth-config\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.559609 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.559280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-oauth-serving-cert\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.559609 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.559297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-trusted-ca-bundle\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.559609 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.559427 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-serving-cert\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.559930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.559905 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-service-ca\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.560180 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.560157 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-config\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.560290 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.560264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-oauth-serving-cert\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.560505 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.560487 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-trusted-ca-bundle\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.561809 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.561789 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-serving-cert\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.561890 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.561809 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-oauth-config\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.567282 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.567261 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmfjs\" (UniqueName: \"kubernetes.io/projected/eafde854-6e62-4738-b2cf-f38ffb6dbf47-kube-api-access-rmfjs\") pod \"console-56586b889b-kxhpx\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.610818 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.610787 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:26.732487 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:26.732336 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56586b889b-kxhpx"] Apr 18 02:50:26.735168 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:50:26.735125 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeafde854_6e62_4738_b2cf_f38ffb6dbf47.slice/crio-995a8d7c47d64938181dcb1fde32bc23d20dc43731ef91a472e135c84bcfd477 WatchSource:0}: Error finding container 995a8d7c47d64938181dcb1fde32bc23d20dc43731ef91a472e135c84bcfd477: Status 404 returned error can't find the container with id 995a8d7c47d64938181dcb1fde32bc23d20dc43731ef91a472e135c84bcfd477 Apr 18 02:50:27.142139 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:27.142055 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56586b889b-kxhpx" event={"ID":"eafde854-6e62-4738-b2cf-f38ffb6dbf47","Type":"ContainerStarted","Data":"4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee"} Apr 18 02:50:27.142139 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:27.142089 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56586b889b-kxhpx" event={"ID":"eafde854-6e62-4738-b2cf-f38ffb6dbf47","Type":"ContainerStarted","Data":"995a8d7c47d64938181dcb1fde32bc23d20dc43731ef91a472e135c84bcfd477"} Apr 18 02:50:27.157258 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:27.157212 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56586b889b-kxhpx" podStartSLOduration=1.157197515 podStartE2EDuration="1.157197515s" podCreationTimestamp="2026-04-18 02:50:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:50:27.156752129 +0000 UTC m=+270.296603467" 
watchObservedRunningTime="2026-04-18 02:50:27.157197515 +0000 UTC m=+270.297048850" Apr 18 02:50:36.611293 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:36.611246 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:36.611293 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:36.611304 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:36.616154 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:36.616126 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:37.174711 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:37.174682 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:50:37.215374 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:37.215340 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d8bcb6fdc-rj7km"] Apr 18 02:50:57.260823 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:57.260795 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 02:50:57.263472 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:57.263450 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 02:50:57.264619 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:57.264602 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 02:50:57.267324 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:57.267305 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 02:50:57.270933 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:50:57.270914 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 18 02:51:02.238107 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.238055 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d8bcb6fdc-rj7km" podUID="f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" containerName="console" containerID="cri-o://7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57" gracePeriod=15 Apr 18 02:51:02.475832 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.475805 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d8bcb6fdc-rj7km_f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87/console/0.log" Apr 18 02:51:02.475933 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.475864 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:51:02.529076 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.529007 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-service-ca\") pod \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " Apr 18 02:51:02.529076 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.529049 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-serving-cert\") pod \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " Apr 18 02:51:02.529076 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.529074 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n94vd\" (UniqueName: \"kubernetes.io/projected/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-kube-api-access-n94vd\") pod \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " Apr 18 02:51:02.529299 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.529109 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-config\") pod \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " Apr 18 02:51:02.529299 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.529147 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-oauth-serving-cert\") pod \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " Apr 18 02:51:02.529299 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:51:02.529171 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-oauth-config\") pod \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " Apr 18 02:51:02.529299 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.529196 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-trusted-ca-bundle\") pod \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\" (UID: \"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87\") " Apr 18 02:51:02.529534 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.529406 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-service-ca" (OuterVolumeSpecName: "service-ca") pod "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" (UID: "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:51:02.529620 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.529591 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-config" (OuterVolumeSpecName: "console-config") pod "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" (UID: "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:51:02.529672 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.529627 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" (UID: "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:51:02.529778 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.529757 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" (UID: "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:51:02.531109 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.531081 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" (UID: "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:51:02.531208 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.531122 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" (UID: "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:51:02.531208 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.531155 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-kube-api-access-n94vd" (OuterVolumeSpecName: "kube-api-access-n94vd") pod "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" (UID: "f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87"). InnerVolumeSpecName "kube-api-access-n94vd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:51:02.630516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.630486 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-config\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:51:02.630516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.630509 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-oauth-serving-cert\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:51:02.630516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.630521 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-oauth-config\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:51:02.630724 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.630531 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-trusted-ca-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:51:02.630724 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.630542 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-service-ca\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:51:02.630724 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.630569 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-console-serving-cert\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:51:02.630724 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:02.630578 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n94vd\" (UniqueName: \"kubernetes.io/projected/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87-kube-api-access-n94vd\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:51:03.240099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:03.240072 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d8bcb6fdc-rj7km_f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87/console/0.log" Apr 18 02:51:03.240483 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:03.240112 2577 generic.go:358] "Generic (PLEG): container finished" podID="f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" containerID="7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57" exitCode=2 Apr 18 02:51:03.240483 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:03.240144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d8bcb6fdc-rj7km" event={"ID":"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87","Type":"ContainerDied","Data":"7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57"} Apr 18 02:51:03.240483 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:03.240175 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d8bcb6fdc-rj7km" Apr 18 02:51:03.240483 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:03.240182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d8bcb6fdc-rj7km" event={"ID":"f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87","Type":"ContainerDied","Data":"d40aa46b4d76772a3dfea8bc0f27721b81e8c8f7ca6f0866e9836f09a090d8c6"} Apr 18 02:51:03.240483 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:03.240197 2577 scope.go:117] "RemoveContainer" containerID="7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57" Apr 18 02:51:03.248331 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:03.248309 2577 scope.go:117] "RemoveContainer" containerID="7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57" Apr 18 02:51:03.250704 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:51:03.248738 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57\": container with ID starting with 7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57 not found: ID does not exist" containerID="7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57" Apr 18 02:51:03.250704 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:03.248772 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57"} err="failed to get container status \"7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57\": rpc error: code = NotFound desc = could not find container \"7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57\": container with ID starting with 7f0989e08153f7077b8ec5278a0973dfdac71d098bfd4a32b5fe10ce68bdfc57 not found: ID does not exist" Apr 18 02:51:03.260042 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:03.260021 2577 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-5d8bcb6fdc-rj7km"] Apr 18 02:51:03.263210 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:03.263189 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d8bcb6fdc-rj7km"] Apr 18 02:51:03.362315 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:03.362285 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" path="/var/lib/kubelet/pods/f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87/volumes" Apr 18 02:51:48.456972 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.456879 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-99fbf494f-hl6gk"] Apr 18 02:51:48.457395 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.457274 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" containerName="console" Apr 18 02:51:48.457395 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.457290 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" containerName="console" Apr 18 02:51:48.457395 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.457363 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6f3ee37-9c6a-4b8c-871a-7ce246fc7c87" containerName="console" Apr 18 02:51:48.460359 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.460339 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.469868 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.469846 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-99fbf494f-hl6gk"] Apr 18 02:51:48.565808 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.565782 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-config\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.565925 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.565814 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-oauth-serving-cert\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.565925 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.565838 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-service-ca\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.565925 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.565882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-serving-cert\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.566030 
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.565925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-oauth-config\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.566030 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.565959 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-trusted-ca-bundle\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.566030 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.566006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsdrv\" (UniqueName: \"kubernetes.io/projected/51c97bd2-5b45-441e-84a0-6c8527e6691b-kube-api-access-rsdrv\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.667138 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.667097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-oauth-config\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.667138 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.667146 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-trusted-ca-bundle\") pod 
\"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.667299 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.667185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsdrv\" (UniqueName: \"kubernetes.io/projected/51c97bd2-5b45-441e-84a0-6c8527e6691b-kube-api-access-rsdrv\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.667299 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.667207 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-config\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.667299 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.667227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-oauth-serving-cert\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.667299 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.667273 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-service-ca\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.667470 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.667310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-serving-cert\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.667997 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.667974 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-service-ca\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.668112 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.668091 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-config\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.668155 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.668113 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-trusted-ca-bundle\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.668155 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.668130 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-oauth-serving-cert\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.669460 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.669436 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-oauth-config\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.669647 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.669631 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-serving-cert\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.675106 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.675079 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsdrv\" (UniqueName: \"kubernetes.io/projected/51c97bd2-5b45-441e-84a0-6c8527e6691b-kube-api-access-rsdrv\") pod \"console-99fbf494f-hl6gk\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.771445 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.771413 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:48.883703 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.883671 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-99fbf494f-hl6gk"] Apr 18 02:51:48.886850 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:51:48.886822 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51c97bd2_5b45_441e_84a0_6c8527e6691b.slice/crio-8b351aee3f2ad0e90c549b2efc29d6ce3ff13051da23c3a5421224c627a82900 WatchSource:0}: Error finding container 8b351aee3f2ad0e90c549b2efc29d6ce3ff13051da23c3a5421224c627a82900: Status 404 returned error can't find the container with id 8b351aee3f2ad0e90c549b2efc29d6ce3ff13051da23c3a5421224c627a82900 Apr 18 02:51:48.888689 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:48.888666 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 18 02:51:49.363481 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:49.363452 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-99fbf494f-hl6gk" event={"ID":"51c97bd2-5b45-441e-84a0-6c8527e6691b","Type":"ContainerStarted","Data":"1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e"} Apr 18 02:51:49.363481 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:49.363481 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-99fbf494f-hl6gk" event={"ID":"51c97bd2-5b45-441e-84a0-6c8527e6691b","Type":"ContainerStarted","Data":"8b351aee3f2ad0e90c549b2efc29d6ce3ff13051da23c3a5421224c627a82900"} Apr 18 02:51:49.379149 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:49.379109 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-99fbf494f-hl6gk" podStartSLOduration=1.379095755 podStartE2EDuration="1.379095755s" podCreationTimestamp="2026-04-18 02:51:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:51:49.378367796 +0000 UTC m=+352.518219133" watchObservedRunningTime="2026-04-18 02:51:49.379095755 +0000 UTC m=+352.518947089" Apr 18 02:51:58.771766 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:58.771732 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:58.771766 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:58.771767 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:58.776570 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:58.776534 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:59.392298 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:59.392265 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:51:59.432681 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:51:59.432649 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56586b889b-kxhpx"] Apr 18 02:52:24.453652 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.453578 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56586b889b-kxhpx" podUID="eafde854-6e62-4738-b2cf-f38ffb6dbf47" containerName="console" containerID="cri-o://4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee" gracePeriod=15 Apr 18 02:52:24.693059 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.693031 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56586b889b-kxhpx_eafde854-6e62-4738-b2cf-f38ffb6dbf47/console/0.log" Apr 18 02:52:24.693191 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.693094 2577 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56586b889b-kxhpx" Apr 18 02:52:24.727952 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.727855 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmfjs\" (UniqueName: \"kubernetes.io/projected/eafde854-6e62-4738-b2cf-f38ffb6dbf47-kube-api-access-rmfjs\") pod \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " Apr 18 02:52:24.727952 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.727906 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-oauth-serving-cert\") pod \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " Apr 18 02:52:24.727952 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.727928 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-service-ca\") pod \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " Apr 18 02:52:24.728230 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.728034 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-serving-cert\") pod \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " Apr 18 02:52:24.728230 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.728097 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-trusted-ca-bundle\") pod \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " Apr 18 
02:52:24.728230 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.728139 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-config\") pod \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " Apr 18 02:52:24.728230 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.728194 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-oauth-config\") pod \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\" (UID: \"eafde854-6e62-4738-b2cf-f38ffb6dbf47\") " Apr 18 02:52:24.728412 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.728303 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-service-ca" (OuterVolumeSpecName: "service-ca") pod "eafde854-6e62-4738-b2cf-f38ffb6dbf47" (UID: "eafde854-6e62-4738-b2cf-f38ffb6dbf47"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:52:24.728454 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.728426 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-service-ca\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:52:24.728505 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.728458 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eafde854-6e62-4738-b2cf-f38ffb6dbf47" (UID: "eafde854-6e62-4738-b2cf-f38ffb6dbf47"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:52:24.728624 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.728586 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-config" (OuterVolumeSpecName: "console-config") pod "eafde854-6e62-4738-b2cf-f38ffb6dbf47" (UID: "eafde854-6e62-4738-b2cf-f38ffb6dbf47"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:52:24.728693 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.728651 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "eafde854-6e62-4738-b2cf-f38ffb6dbf47" (UID: "eafde854-6e62-4738-b2cf-f38ffb6dbf47"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:52:24.730172 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.730139 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eafde854-6e62-4738-b2cf-f38ffb6dbf47" (UID: "eafde854-6e62-4738-b2cf-f38ffb6dbf47"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:52:24.730270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.730172 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eafde854-6e62-4738-b2cf-f38ffb6dbf47" (UID: "eafde854-6e62-4738-b2cf-f38ffb6dbf47"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:52:24.730270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.730251 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eafde854-6e62-4738-b2cf-f38ffb6dbf47-kube-api-access-rmfjs" (OuterVolumeSpecName: "kube-api-access-rmfjs") pod "eafde854-6e62-4738-b2cf-f38ffb6dbf47" (UID: "eafde854-6e62-4738-b2cf-f38ffb6dbf47"). InnerVolumeSpecName "kube-api-access-rmfjs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:52:24.828971 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.828937 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rmfjs\" (UniqueName: \"kubernetes.io/projected/eafde854-6e62-4738-b2cf-f38ffb6dbf47-kube-api-access-rmfjs\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:52:24.828971 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.828968 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-oauth-serving-cert\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:52:24.828971 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.828978 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-serving-cert\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:52:24.829219 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.828987 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-trusted-ca-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:52:24.829219 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.828999 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-config\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:52:24.829219 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:24.829015 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eafde854-6e62-4738-b2cf-f38ffb6dbf47-console-oauth-config\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:52:25.459269 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:25.459235 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56586b889b-kxhpx_eafde854-6e62-4738-b2cf-f38ffb6dbf47/console/0.log" Apr 18 02:52:25.459755 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:25.459275 2577 generic.go:358] "Generic (PLEG): container finished" podID="eafde854-6e62-4738-b2cf-f38ffb6dbf47" containerID="4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee" exitCode=2 Apr 18 02:52:25.459755 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:25.459303 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56586b889b-kxhpx" event={"ID":"eafde854-6e62-4738-b2cf-f38ffb6dbf47","Type":"ContainerDied","Data":"4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee"} Apr 18 02:52:25.459755 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:25.459353 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56586b889b-kxhpx" event={"ID":"eafde854-6e62-4738-b2cf-f38ffb6dbf47","Type":"ContainerDied","Data":"995a8d7c47d64938181dcb1fde32bc23d20dc43731ef91a472e135c84bcfd477"} Apr 18 02:52:25.459755 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:25.459362 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56586b889b-kxhpx"
Apr 18 02:52:25.459755 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:25.459370 2577 scope.go:117] "RemoveContainer" containerID="4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee"
Apr 18 02:52:25.467388 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:25.467370 2577 scope.go:117] "RemoveContainer" containerID="4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee"
Apr 18 02:52:25.467631 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:52:25.467607 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee\": container with ID starting with 4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee not found: ID does not exist" containerID="4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee"
Apr 18 02:52:25.467723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:25.467637 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee"} err="failed to get container status \"4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee\": rpc error: code = NotFound desc = could not find container \"4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee\": container with ID starting with 4852d77dc26b4755576ec0f88cadca26ef9fda405c73d90cce9468bdaa6f05ee not found: ID does not exist"
Apr 18 02:52:25.474224 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:25.474204 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56586b889b-kxhpx"]
Apr 18 02:52:25.477384 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:25.477367 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56586b889b-kxhpx"]
Apr 18 02:52:27.362981 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:27.362945 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eafde854-6e62-4738-b2cf-f38ffb6dbf47" path="/var/lib/kubelet/pods/eafde854-6e62-4738-b2cf-f38ffb6dbf47/volumes"
Apr 18 02:52:39.023049 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.023014 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"]
Apr 18 02:52:39.023466 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.023310 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eafde854-6e62-4738-b2cf-f38ffb6dbf47" containerName="console"
Apr 18 02:52:39.023466 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.023330 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafde854-6e62-4738-b2cf-f38ffb6dbf47" containerName="console"
Apr 18 02:52:39.023466 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.023385 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="eafde854-6e62-4738-b2cf-f38ffb6dbf47" containerName="console"
Apr 18 02:52:39.027964 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.027942 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:39.030770 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.030741 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dj4h2\""
Apr 18 02:52:39.030966 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.030793 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 18 02:52:39.030966 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.030815 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 18 02:52:39.033203 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.033178 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"]
Apr 18 02:52:39.134136 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.134098 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6525670-e151-4371-b1cb-5716113da9d6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw\" (UID: \"a6525670-e151-4371-b1cb-5716113da9d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:39.134136 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.134140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wl66\" (UniqueName: \"kubernetes.io/projected/a6525670-e151-4371-b1cb-5716113da9d6-kube-api-access-5wl66\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw\" (UID: \"a6525670-e151-4371-b1cb-5716113da9d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:39.134372 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.134254 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6525670-e151-4371-b1cb-5716113da9d6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw\" (UID: \"a6525670-e151-4371-b1cb-5716113da9d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:39.235487 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.235455 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wl66\" (UniqueName: \"kubernetes.io/projected/a6525670-e151-4371-b1cb-5716113da9d6-kube-api-access-5wl66\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw\" (UID: \"a6525670-e151-4371-b1cb-5716113da9d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:39.235695 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.235514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6525670-e151-4371-b1cb-5716113da9d6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw\" (UID: \"a6525670-e151-4371-b1cb-5716113da9d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:39.235695 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.235574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6525670-e151-4371-b1cb-5716113da9d6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw\" (UID: \"a6525670-e151-4371-b1cb-5716113da9d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:39.235900 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.235883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6525670-e151-4371-b1cb-5716113da9d6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw\" (UID: \"a6525670-e151-4371-b1cb-5716113da9d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:39.235969 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.235946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6525670-e151-4371-b1cb-5716113da9d6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw\" (UID: \"a6525670-e151-4371-b1cb-5716113da9d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:39.243625 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.243594 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wl66\" (UniqueName: \"kubernetes.io/projected/a6525670-e151-4371-b1cb-5716113da9d6-kube-api-access-5wl66\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw\" (UID: \"a6525670-e151-4371-b1cb-5716113da9d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:39.338458 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.338374 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:39.460054 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.460020 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"]
Apr 18 02:52:39.463780 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:52:39.463741 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6525670_e151_4371_b1cb_5716113da9d6.slice/crio-3776993b9f9d82f653bd263374d841227e6ed47c94d9165487f4a534cb7feb61 WatchSource:0}: Error finding container 3776993b9f9d82f653bd263374d841227e6ed47c94d9165487f4a534cb7feb61: Status 404 returned error can't find the container with id 3776993b9f9d82f653bd263374d841227e6ed47c94d9165487f4a534cb7feb61
Apr 18 02:52:39.497794 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:39.497759 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw" event={"ID":"a6525670-e151-4371-b1cb-5716113da9d6","Type":"ContainerStarted","Data":"3776993b9f9d82f653bd263374d841227e6ed47c94d9165487f4a534cb7feb61"}
Apr 18 02:52:44.514111 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:44.514077 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw" event={"ID":"a6525670-e151-4371-b1cb-5716113da9d6","Type":"ContainerStarted","Data":"92c6e8b7aba90a999a6c236b2b06e205e664535d776d670a8ab2b02d888e8c11"}
Apr 18 02:52:45.517777 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:45.517744 2577 generic.go:358] "Generic (PLEG): container finished" podID="a6525670-e151-4371-b1cb-5716113da9d6" containerID="92c6e8b7aba90a999a6c236b2b06e205e664535d776d670a8ab2b02d888e8c11" exitCode=0
Apr 18 02:52:45.518149 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:45.517793 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw" event={"ID":"a6525670-e151-4371-b1cb-5716113da9d6","Type":"ContainerDied","Data":"92c6e8b7aba90a999a6c236b2b06e205e664535d776d670a8ab2b02d888e8c11"}
Apr 18 02:52:47.525144 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:47.525112 2577 generic.go:358] "Generic (PLEG): container finished" podID="a6525670-e151-4371-b1cb-5716113da9d6" containerID="f97864bc63af238e030409b5e46da301ec496159b6a9adc1860745a6da2c4a09" exitCode=0
Apr 18 02:52:47.525485 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:47.525205 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw" event={"ID":"a6525670-e151-4371-b1cb-5716113da9d6","Type":"ContainerDied","Data":"f97864bc63af238e030409b5e46da301ec496159b6a9adc1860745a6da2c4a09"}
Apr 18 02:52:53.546116 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:53.546081 2577 generic.go:358] "Generic (PLEG): container finished" podID="a6525670-e151-4371-b1cb-5716113da9d6" containerID="e93c1fef1ae358a316a6492da31511351c3c37b755cc521bcc4acc783015d3e3" exitCode=0
Apr 18 02:52:53.546476 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:53.546141 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw" event={"ID":"a6525670-e151-4371-b1cb-5716113da9d6","Type":"ContainerDied","Data":"e93c1fef1ae358a316a6492da31511351c3c37b755cc521bcc4acc783015d3e3"}
Apr 18 02:52:54.664881 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:54.664858 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:54.762482 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:54.762441 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wl66\" (UniqueName: \"kubernetes.io/projected/a6525670-e151-4371-b1cb-5716113da9d6-kube-api-access-5wl66\") pod \"a6525670-e151-4371-b1cb-5716113da9d6\" (UID: \"a6525670-e151-4371-b1cb-5716113da9d6\") "
Apr 18 02:52:54.762702 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:54.762522 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6525670-e151-4371-b1cb-5716113da9d6-bundle\") pod \"a6525670-e151-4371-b1cb-5716113da9d6\" (UID: \"a6525670-e151-4371-b1cb-5716113da9d6\") "
Apr 18 02:52:54.762702 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:54.762542 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6525670-e151-4371-b1cb-5716113da9d6-util\") pod \"a6525670-e151-4371-b1cb-5716113da9d6\" (UID: \"a6525670-e151-4371-b1cb-5716113da9d6\") "
Apr 18 02:52:54.763144 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:54.763121 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6525670-e151-4371-b1cb-5716113da9d6-bundle" (OuterVolumeSpecName: "bundle") pod "a6525670-e151-4371-b1cb-5716113da9d6" (UID: "a6525670-e151-4371-b1cb-5716113da9d6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:52:54.764695 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:54.764673 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6525670-e151-4371-b1cb-5716113da9d6-kube-api-access-5wl66" (OuterVolumeSpecName: "kube-api-access-5wl66") pod "a6525670-e151-4371-b1cb-5716113da9d6" (UID: "a6525670-e151-4371-b1cb-5716113da9d6"). InnerVolumeSpecName "kube-api-access-5wl66". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:52:54.767736 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:54.767705 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6525670-e151-4371-b1cb-5716113da9d6-util" (OuterVolumeSpecName: "util") pod "a6525670-e151-4371-b1cb-5716113da9d6" (UID: "a6525670-e151-4371-b1cb-5716113da9d6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:52:54.863780 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:54.863698 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6525670-e151-4371-b1cb-5716113da9d6-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\""
Apr 18 02:52:54.863780 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:54.863728 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6525670-e151-4371-b1cb-5716113da9d6-util\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\""
Apr 18 02:52:54.863780 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:54.863737 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5wl66\" (UniqueName: \"kubernetes.io/projected/a6525670-e151-4371-b1cb-5716113da9d6-kube-api-access-5wl66\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\""
Apr 18 02:52:55.553136 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:55.553106 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw"
Apr 18 02:52:55.553299 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:55.553104 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56kznw" event={"ID":"a6525670-e151-4371-b1cb-5716113da9d6","Type":"ContainerDied","Data":"3776993b9f9d82f653bd263374d841227e6ed47c94d9165487f4a534cb7feb61"}
Apr 18 02:52:55.553299 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:52:55.553214 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3776993b9f9d82f653bd263374d841227e6ed47c94d9165487f4a534cb7feb61"
Apr 18 02:53:01.266803 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.266770 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp"]
Apr 18 02:53:01.267351 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.267034 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6525670-e151-4371-b1cb-5716113da9d6" containerName="pull"
Apr 18 02:53:01.267351 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.267044 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6525670-e151-4371-b1cb-5716113da9d6" containerName="pull"
Apr 18 02:53:01.267351 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.267058 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6525670-e151-4371-b1cb-5716113da9d6" containerName="util"
Apr 18 02:53:01.267351 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.267064 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6525670-e151-4371-b1cb-5716113da9d6" containerName="util"
Apr 18 02:53:01.267351 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.267076 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6525670-e151-4371-b1cb-5716113da9d6" containerName="extract"
Apr 18 02:53:01.267351 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.267081 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6525670-e151-4371-b1cb-5716113da9d6" containerName="extract"
Apr 18 02:53:01.267351 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.267125 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6525670-e151-4371-b1cb-5716113da9d6" containerName="extract"
Apr 18 02:53:01.271819 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.271800 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp"
Apr 18 02:53:01.274837 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.274809 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 18 02:53:01.274837 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.274816 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 18 02:53:01.274996 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.274864 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-rdd5l\""
Apr 18 02:53:01.281789 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.281771 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp"]
Apr 18 02:53:01.417313 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.417269 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f5e03191-217b-404a-b672-601c1c983faa-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-dcvzp\" (UID: \"f5e03191-217b-404a-b672-601c1c983faa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp"
Apr 18 02:53:01.417313 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.417316 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbtjd\" (UniqueName: \"kubernetes.io/projected/f5e03191-217b-404a-b672-601c1c983faa-kube-api-access-qbtjd\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-dcvzp\" (UID: \"f5e03191-217b-404a-b672-601c1c983faa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp"
Apr 18 02:53:01.518539 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.518465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f5e03191-217b-404a-b672-601c1c983faa-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-dcvzp\" (UID: \"f5e03191-217b-404a-b672-601c1c983faa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp"
Apr 18 02:53:01.518539 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.518529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbtjd\" (UniqueName: \"kubernetes.io/projected/f5e03191-217b-404a-b672-601c1c983faa-kube-api-access-qbtjd\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-dcvzp\" (UID: \"f5e03191-217b-404a-b672-601c1c983faa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp"
Apr 18 02:53:01.519471 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.519451 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f5e03191-217b-404a-b672-601c1c983faa-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-dcvzp\" (UID: \"f5e03191-217b-404a-b672-601c1c983faa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp"
Apr 18 02:53:01.526034 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.525998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbtjd\" (UniqueName: \"kubernetes.io/projected/f5e03191-217b-404a-b672-601c1c983faa-kube-api-access-qbtjd\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-dcvzp\" (UID: \"f5e03191-217b-404a-b672-601c1c983faa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp"
Apr 18 02:53:01.580063 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.580016 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp"
Apr 18 02:53:01.706711 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:01.706683 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp"]
Apr 18 02:53:01.709501 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:53:01.709471 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5e03191_217b_404a_b672_601c1c983faa.slice/crio-2dd94d335a6dcb5717be3aae386406edf30dbab0d9adc308861f95d3f3fdf05c WatchSource:0}: Error finding container 2dd94d335a6dcb5717be3aae386406edf30dbab0d9adc308861f95d3f3fdf05c: Status 404 returned error can't find the container with id 2dd94d335a6dcb5717be3aae386406edf30dbab0d9adc308861f95d3f3fdf05c
Apr 18 02:53:02.573983 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:02.573947 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp" event={"ID":"f5e03191-217b-404a-b672-601c1c983faa","Type":"ContainerStarted","Data":"2dd94d335a6dcb5717be3aae386406edf30dbab0d9adc308861f95d3f3fdf05c"}
Apr 18 02:53:04.582117 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:04.582082 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp" event={"ID":"f5e03191-217b-404a-b672-601c1c983faa","Type":"ContainerStarted","Data":"a348eba57265eb0bbfc7762f517bd9cfad0f91ebc7a6a777fdfae8cb4055bc83"}
Apr 18 02:53:04.603819 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:04.603773 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dcvzp" podStartSLOduration=1.484283768 podStartE2EDuration="3.603761781s" podCreationTimestamp="2026-04-18 02:53:01 +0000 UTC" firstStartedPulling="2026-04-18 02:53:01.711979791 +0000 UTC m=+424.851831104" lastFinishedPulling="2026-04-18 02:53:03.831457804 +0000 UTC m=+426.971309117" observedRunningTime="2026-04-18 02:53:04.602414043 +0000 UTC m=+427.742265375" watchObservedRunningTime="2026-04-18 02:53:04.603761781 +0000 UTC m=+427.743613116"
Apr 18 02:53:06.161002 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.160964 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"]
Apr 18 02:53:06.164459 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.164442 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"
Apr 18 02:53:06.166833 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.166813 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dj4h2\""
Apr 18 02:53:06.166930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.166858 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 18 02:53:06.166930 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.166890 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 18 02:53:06.170784 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.170763 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"]
Apr 18 02:53:06.255306 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.255275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85068248-0c21-4425-8bb4-9cc510e6fec0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g\" (UID: \"85068248-0c21-4425-8bb4-9cc510e6fec0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"
Apr 18 02:53:06.255475 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.255376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85068248-0c21-4425-8bb4-9cc510e6fec0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g\" (UID: \"85068248-0c21-4425-8bb4-9cc510e6fec0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"
Apr 18 02:53:06.255475 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.255424 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbj9f\" (UniqueName: \"kubernetes.io/projected/85068248-0c21-4425-8bb4-9cc510e6fec0-kube-api-access-mbj9f\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g\" (UID: \"85068248-0c21-4425-8bb4-9cc510e6fec0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"
Apr 18 02:53:06.356180 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.356145 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85068248-0c21-4425-8bb4-9cc510e6fec0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g\" (UID: \"85068248-0c21-4425-8bb4-9cc510e6fec0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"
Apr 18 02:53:06.356333 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.356213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85068248-0c21-4425-8bb4-9cc510e6fec0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g\" (UID: \"85068248-0c21-4425-8bb4-9cc510e6fec0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"
Apr 18 02:53:06.356333 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.356252 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbj9f\" (UniqueName: \"kubernetes.io/projected/85068248-0c21-4425-8bb4-9cc510e6fec0-kube-api-access-mbj9f\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g\" (UID: \"85068248-0c21-4425-8bb4-9cc510e6fec0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"
Apr 18 02:53:06.356605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.356581 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85068248-0c21-4425-8bb4-9cc510e6fec0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g\" (UID: \"85068248-0c21-4425-8bb4-9cc510e6fec0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"
Apr 18 02:53:06.356650 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.356603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85068248-0c21-4425-8bb4-9cc510e6fec0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g\" (UID: \"85068248-0c21-4425-8bb4-9cc510e6fec0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"
Apr 18 02:53:06.365388 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.365366 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbj9f\" (UniqueName: \"kubernetes.io/projected/85068248-0c21-4425-8bb4-9cc510e6fec0-kube-api-access-mbj9f\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g\" (UID: \"85068248-0c21-4425-8bb4-9cc510e6fec0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"
Apr 18 02:53:06.474336 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.474233 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"
Apr 18 02:53:06.592345 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:06.592314 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"]
Apr 18 02:53:06.595109 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:53:06.595082 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85068248_0c21_4425_8bb4_9cc510e6fec0.slice/crio-b52a2a3e187a1f2192406ea3bc61e10180ad91750c874ef2d18befa50d038a93 WatchSource:0}: Error finding container b52a2a3e187a1f2192406ea3bc61e10180ad91750c874ef2d18befa50d038a93: Status 404 returned error can't find the container with id b52a2a3e187a1f2192406ea3bc61e10180ad91750c874ef2d18befa50d038a93
Apr 18 02:53:07.593708 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:07.593671 2577 generic.go:358] "Generic (PLEG): container finished" podID="85068248-0c21-4425-8bb4-9cc510e6fec0" containerID="4653af0f4db56a7beb3e76a07bab531782d9acf4408bb6bae15ea20b2a8e48c7" exitCode=0
Apr 18 02:53:07.594159 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:07.593753 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g" event={"ID":"85068248-0c21-4425-8bb4-9cc510e6fec0","Type":"ContainerDied","Data":"4653af0f4db56a7beb3e76a07bab531782d9acf4408bb6bae15ea20b2a8e48c7"}
Apr 18 02:53:07.594159 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:07.593796 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g" event={"ID":"85068248-0c21-4425-8bb4-9cc510e6fec0","Type":"ContainerStarted","Data":"b52a2a3e187a1f2192406ea3bc61e10180ad91750c874ef2d18befa50d038a93"}
Apr 18 02:53:10.606157 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:10.606124 2577 generic.go:358] "Generic (PLEG): container finished" podID="85068248-0c21-4425-8bb4-9cc510e6fec0" containerID="0a6bf2be9c24c4744347f15d0853a3332d90bf1e7efa42a215344329c4723e9e" exitCode=0
Apr 18 02:53:10.606520 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:10.606215 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g" event={"ID":"85068248-0c21-4425-8bb4-9cc510e6fec0","Type":"ContainerDied","Data":"0a6bf2be9c24c4744347f15d0853a3332d90bf1e7efa42a215344329c4723e9e"}
Apr 18 02:53:10.904388 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:10.904305 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-t4wcg"]
Apr 18 02:53:10.907561 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:10.907535 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-t4wcg"
Apr 18 02:53:10.909981 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:10.909950 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 18 02:53:10.909981 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:10.909964 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 18 02:53:10.910126 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:10.909983 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-9hbfb\""
Apr 18 02:53:10.916070 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:10.916045 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-t4wcg"]
Apr 18 02:53:10.997602 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:10.997541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5j26\" (UniqueName: \"kubernetes.io/projected/150df87f-998f-4c24-a63f-42e6a0eee6f2-kube-api-access-s5j26\") pod \"cert-manager-cainjector-8966b78d4-t4wcg\" (UID: \"150df87f-998f-4c24-a63f-42e6a0eee6f2\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-t4wcg"
Apr 18 02:53:10.997742 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:10.997617 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/150df87f-998f-4c24-a63f-42e6a0eee6f2-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-t4wcg\" (UID: \"150df87f-998f-4c24-a63f-42e6a0eee6f2\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-t4wcg"
Apr 18 02:53:11.098481 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:11.098446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5j26\" (UniqueName: \"kubernetes.io/projected/150df87f-998f-4c24-a63f-42e6a0eee6f2-kube-api-access-s5j26\") pod \"cert-manager-cainjector-8966b78d4-t4wcg\" (UID: \"150df87f-998f-4c24-a63f-42e6a0eee6f2\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-t4wcg"
Apr 18 02:53:11.098481 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:11.098486 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/150df87f-998f-4c24-a63f-42e6a0eee6f2-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-t4wcg\" (UID: \"150df87f-998f-4c24-a63f-42e6a0eee6f2\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-t4wcg"
Apr 18 02:53:11.106196 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:11.106167 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/150df87f-998f-4c24-a63f-42e6a0eee6f2-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-t4wcg\" (UID: \"150df87f-998f-4c24-a63f-42e6a0eee6f2\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-t4wcg"
Apr 18 02:53:11.106306 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:11.106171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5j26\" (UniqueName: \"kubernetes.io/projected/150df87f-998f-4c24-a63f-42e6a0eee6f2-kube-api-access-s5j26\") pod \"cert-manager-cainjector-8966b78d4-t4wcg\" (UID: \"150df87f-998f-4c24-a63f-42e6a0eee6f2\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-t4wcg"
Apr 18 02:53:11.229619 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:11.229590 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-t4wcg"
Apr 18 02:53:11.343933 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:11.343905 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-t4wcg"]
Apr 18 02:53:11.346899 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:53:11.346868 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod150df87f_998f_4c24_a63f_42e6a0eee6f2.slice/crio-a380f1358c8c6586bf6744cf60df50840d3f76e8526b354c9df6fae1f48e6eaa WatchSource:0}: Error finding container a380f1358c8c6586bf6744cf60df50840d3f76e8526b354c9df6fae1f48e6eaa: Status 404 returned error can't find the container with id a380f1358c8c6586bf6744cf60df50840d3f76e8526b354c9df6fae1f48e6eaa
Apr 18 02:53:11.611455 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:11.611374 2577 generic.go:358] "Generic (PLEG): container finished" podID="85068248-0c21-4425-8bb4-9cc510e6fec0" containerID="e9e25e32e6468bcfe8dc10306c652ece5c60664293bd5c73c364be3fa7f77a41" exitCode=0
Apr 18 02:53:11.611872 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:11.611460 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g"
event={"ID":"85068248-0c21-4425-8bb4-9cc510e6fec0","Type":"ContainerDied","Data":"e9e25e32e6468bcfe8dc10306c652ece5c60664293bd5c73c364be3fa7f77a41"} Apr 18 02:53:11.612499 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:11.612475 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-t4wcg" event={"ID":"150df87f-998f-4c24-a63f-42e6a0eee6f2","Type":"ContainerStarted","Data":"a380f1358c8c6586bf6744cf60df50840d3f76e8526b354c9df6fae1f48e6eaa"} Apr 18 02:53:12.756726 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:12.756703 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g" Apr 18 02:53:12.917150 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:12.917057 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbj9f\" (UniqueName: \"kubernetes.io/projected/85068248-0c21-4425-8bb4-9cc510e6fec0-kube-api-access-mbj9f\") pod \"85068248-0c21-4425-8bb4-9cc510e6fec0\" (UID: \"85068248-0c21-4425-8bb4-9cc510e6fec0\") " Apr 18 02:53:12.917150 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:12.917117 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85068248-0c21-4425-8bb4-9cc510e6fec0-bundle\") pod \"85068248-0c21-4425-8bb4-9cc510e6fec0\" (UID: \"85068248-0c21-4425-8bb4-9cc510e6fec0\") " Apr 18 02:53:12.917384 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:12.917219 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85068248-0c21-4425-8bb4-9cc510e6fec0-util\") pod \"85068248-0c21-4425-8bb4-9cc510e6fec0\" (UID: \"85068248-0c21-4425-8bb4-9cc510e6fec0\") " Apr 18 02:53:12.917610 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:12.917579 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/85068248-0c21-4425-8bb4-9cc510e6fec0-bundle" (OuterVolumeSpecName: "bundle") pod "85068248-0c21-4425-8bb4-9cc510e6fec0" (UID: "85068248-0c21-4425-8bb4-9cc510e6fec0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:53:12.919645 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:12.919622 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85068248-0c21-4425-8bb4-9cc510e6fec0-kube-api-access-mbj9f" (OuterVolumeSpecName: "kube-api-access-mbj9f") pod "85068248-0c21-4425-8bb4-9cc510e6fec0" (UID: "85068248-0c21-4425-8bb4-9cc510e6fec0"). InnerVolumeSpecName "kube-api-access-mbj9f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:53:12.924348 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:12.924309 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85068248-0c21-4425-8bb4-9cc510e6fec0-util" (OuterVolumeSpecName: "util") pod "85068248-0c21-4425-8bb4-9cc510e6fec0" (UID: "85068248-0c21-4425-8bb4-9cc510e6fec0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:53:13.017965 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:13.017926 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mbj9f\" (UniqueName: \"kubernetes.io/projected/85068248-0c21-4425-8bb4-9cc510e6fec0-kube-api-access-mbj9f\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:53:13.017965 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:13.017962 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85068248-0c21-4425-8bb4-9cc510e6fec0-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:53:13.017965 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:13.017972 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85068248-0c21-4425-8bb4-9cc510e6fec0-util\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:53:13.622506 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:13.622414 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g" event={"ID":"85068248-0c21-4425-8bb4-9cc510e6fec0","Type":"ContainerDied","Data":"b52a2a3e187a1f2192406ea3bc61e10180ad91750c874ef2d18befa50d038a93"} Apr 18 02:53:13.622506 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:13.622453 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b52a2a3e187a1f2192406ea3bc61e10180ad91750c874ef2d18befa50d038a93" Apr 18 02:53:13.622506 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:13.622466 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f8kr2g" Apr 18 02:53:14.628592 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:14.628528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-t4wcg" event={"ID":"150df87f-998f-4c24-a63f-42e6a0eee6f2","Type":"ContainerStarted","Data":"a6c6871fa5e0f95f7b42fcdfbfe49c2ff505d297da53b5aebfde1090532ba749"} Apr 18 02:53:14.642897 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:14.642833 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-t4wcg" podStartSLOduration=1.883206753 podStartE2EDuration="4.642812118s" podCreationTimestamp="2026-04-18 02:53:10 +0000 UTC" firstStartedPulling="2026-04-18 02:53:11.348777151 +0000 UTC m=+434.488628468" lastFinishedPulling="2026-04-18 02:53:14.108382517 +0000 UTC m=+437.248233833" observedRunningTime="2026-04-18 02:53:14.641910931 +0000 UTC m=+437.781762265" watchObservedRunningTime="2026-04-18 02:53:14.642812118 +0000 UTC m=+437.782663455" Apr 18 02:53:16.381823 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.381787 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z"] Apr 18 02:53:16.382169 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.382094 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85068248-0c21-4425-8bb4-9cc510e6fec0" containerName="util" Apr 18 02:53:16.382169 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.382107 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="85068248-0c21-4425-8bb4-9cc510e6fec0" containerName="util" Apr 18 02:53:16.382169 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.382120 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85068248-0c21-4425-8bb4-9cc510e6fec0" containerName="pull" Apr 18 02:53:16.382169 
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.382126 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="85068248-0c21-4425-8bb4-9cc510e6fec0" containerName="pull" Apr 18 02:53:16.382169 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.382132 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85068248-0c21-4425-8bb4-9cc510e6fec0" containerName="extract" Apr 18 02:53:16.382169 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.382138 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="85068248-0c21-4425-8bb4-9cc510e6fec0" containerName="extract" Apr 18 02:53:16.382356 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.382184 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="85068248-0c21-4425-8bb4-9cc510e6fec0" containerName="extract" Apr 18 02:53:16.384966 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.384950 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z" Apr 18 02:53:16.387396 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.387376 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-nl924\"" Apr 18 02:53:16.387396 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.387389 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 18 02:53:16.387563 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.387390 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 18 02:53:16.393152 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.393128 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z"] Apr 18 02:53:16.549031 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.548986 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzbdf\" (UniqueName: \"kubernetes.io/projected/96520594-5eda-4ca9-a837-b81f770fafd1-kube-api-access-pzbdf\") pod \"openshift-lws-operator-bfc7f696d-9mz7z\" (UID: \"96520594-5eda-4ca9-a837-b81f770fafd1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z" Apr 18 02:53:16.549215 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.549147 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/96520594-5eda-4ca9-a837-b81f770fafd1-tmp\") pod \"openshift-lws-operator-bfc7f696d-9mz7z\" (UID: \"96520594-5eda-4ca9-a837-b81f770fafd1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z" Apr 18 02:53:16.650598 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.650493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/96520594-5eda-4ca9-a837-b81f770fafd1-tmp\") pod \"openshift-lws-operator-bfc7f696d-9mz7z\" (UID: \"96520594-5eda-4ca9-a837-b81f770fafd1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z" Apr 18 02:53:16.650598 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.650541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzbdf\" (UniqueName: \"kubernetes.io/projected/96520594-5eda-4ca9-a837-b81f770fafd1-kube-api-access-pzbdf\") pod \"openshift-lws-operator-bfc7f696d-9mz7z\" (UID: \"96520594-5eda-4ca9-a837-b81f770fafd1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z" Apr 18 02:53:16.650919 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.650899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/96520594-5eda-4ca9-a837-b81f770fafd1-tmp\") pod \"openshift-lws-operator-bfc7f696d-9mz7z\" (UID: 
\"96520594-5eda-4ca9-a837-b81f770fafd1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z" Apr 18 02:53:16.657698 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.657678 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzbdf\" (UniqueName: \"kubernetes.io/projected/96520594-5eda-4ca9-a837-b81f770fafd1-kube-api-access-pzbdf\") pod \"openshift-lws-operator-bfc7f696d-9mz7z\" (UID: \"96520594-5eda-4ca9-a837-b81f770fafd1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z" Apr 18 02:53:16.693750 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.693715 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z" Apr 18 02:53:16.812497 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:16.812466 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z"] Apr 18 02:53:16.814919 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:53:16.814891 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96520594_5eda_4ca9_a837_b81f770fafd1.slice/crio-b119f7bbd1b25279ebdce40b37faf14443042d6d64ddc311e2f2300b8a2f84f9 WatchSource:0}: Error finding container b119f7bbd1b25279ebdce40b37faf14443042d6d64ddc311e2f2300b8a2f84f9: Status 404 returned error can't find the container with id b119f7bbd1b25279ebdce40b37faf14443042d6d64ddc311e2f2300b8a2f84f9 Apr 18 02:53:17.639017 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:17.638983 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z" event={"ID":"96520594-5eda-4ca9-a837-b81f770fafd1","Type":"ContainerStarted","Data":"b119f7bbd1b25279ebdce40b37faf14443042d6d64ddc311e2f2300b8a2f84f9"} Apr 18 02:53:19.646693 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:19.646653 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z" event={"ID":"96520594-5eda-4ca9-a837-b81f770fafd1","Type":"ContainerStarted","Data":"a344b611e3deb5fb6ecf012826058574c923541654b63ea54f5288df3dcccf1b"} Apr 18 02:53:19.661522 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:19.661472 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9mz7z" podStartSLOduration=1.471185617 podStartE2EDuration="3.661458794s" podCreationTimestamp="2026-04-18 02:53:16 +0000 UTC" firstStartedPulling="2026-04-18 02:53:16.816378155 +0000 UTC m=+439.956229468" lastFinishedPulling="2026-04-18 02:53:19.006651321 +0000 UTC m=+442.146502645" observedRunningTime="2026-04-18 02:53:19.660277584 +0000 UTC m=+442.800128919" watchObservedRunningTime="2026-04-18 02:53:19.661458794 +0000 UTC m=+442.801310129" Apr 18 02:53:22.161608 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.161572 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p"] Apr 18 02:53:22.193282 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.193252 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p"] Apr 18 02:53:22.193419 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.193380 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:22.195747 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.195723 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dj4h2\"" Apr 18 02:53:22.195870 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.195822 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 18 02:53:22.196714 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.196697 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 18 02:53:22.296211 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.296179 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p\" (UID: \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:22.296211 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.296213 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6wf4\" (UniqueName: \"kubernetes.io/projected/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-kube-api-access-f6wf4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p\" (UID: \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:22.296464 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.296321 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p\" (UID: \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:22.397396 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.397363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p\" (UID: \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:22.397396 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.397398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6wf4\" (UniqueName: \"kubernetes.io/projected/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-kube-api-access-f6wf4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p\" (UID: \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:22.397634 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.397441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p\" (UID: \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:22.397778 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.397759 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-util\") pod 
\"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p\" (UID: \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:22.397817 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.397796 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p\" (UID: \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:22.405267 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.405240 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6wf4\" (UniqueName: \"kubernetes.io/projected/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-kube-api-access-f6wf4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p\" (UID: \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:22.502585 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.502538 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:22.639436 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.639403 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p"] Apr 18 02:53:22.642259 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:53:22.642226 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1d88ffc_2e8f_4403_88dd_a8cd6b56da0f.slice/crio-c3af3dbee9e3689ed8e138e324e8aecb95a8762cf7779ed7097936087a0c3905 WatchSource:0}: Error finding container c3af3dbee9e3689ed8e138e324e8aecb95a8762cf7779ed7097936087a0c3905: Status 404 returned error can't find the container with id c3af3dbee9e3689ed8e138e324e8aecb95a8762cf7779ed7097936087a0c3905 Apr 18 02:53:22.656069 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:22.656038 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" event={"ID":"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f","Type":"ContainerStarted","Data":"c3af3dbee9e3689ed8e138e324e8aecb95a8762cf7779ed7097936087a0c3905"} Apr 18 02:53:23.661051 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:23.660972 2577 generic.go:358] "Generic (PLEG): container finished" podID="e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" containerID="f6e662b12ab553fc83e409c9807ff049fb585fcb60a254d5ff4c6c3495921f1a" exitCode=0 Apr 18 02:53:23.661378 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:23.661098 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" event={"ID":"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f","Type":"ContainerDied","Data":"f6e662b12ab553fc83e409c9807ff049fb585fcb60a254d5ff4c6c3495921f1a"} Apr 18 02:53:25.669589 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:53:25.669536 2577 generic.go:358] "Generic (PLEG): container finished" podID="e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" containerID="ce270fe31e5145d751f95eea784f7e6abfa5d54b4416f90ca7639e309efeda77" exitCode=0 Apr 18 02:53:25.669951 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:25.669622 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" event={"ID":"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f","Type":"ContainerDied","Data":"ce270fe31e5145d751f95eea784f7e6abfa5d54b4416f90ca7639e309efeda77"} Apr 18 02:53:26.675065 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:26.675032 2577 generic.go:358] "Generic (PLEG): container finished" podID="e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" containerID="0f2ebae6ae0c1b5b275b523393055f6145387a0de60321c444b49d3dc497a90a" exitCode=0 Apr 18 02:53:26.675438 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:26.675097 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" event={"ID":"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f","Type":"ContainerDied","Data":"0f2ebae6ae0c1b5b275b523393055f6145387a0de60321c444b49d3dc497a90a"} Apr 18 02:53:27.795485 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:27.795451 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:27.848684 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:27.848646 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-bundle\") pod \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\" (UID: \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\") " Apr 18 02:53:27.848849 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:27.848753 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6wf4\" (UniqueName: \"kubernetes.io/projected/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-kube-api-access-f6wf4\") pod \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\" (UID: \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\") " Apr 18 02:53:27.848849 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:27.848779 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-util\") pod \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\" (UID: \"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f\") " Apr 18 02:53:27.855059 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:27.851529 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-kube-api-access-f6wf4" (OuterVolumeSpecName: "kube-api-access-f6wf4") pod "e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" (UID: "e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f"). InnerVolumeSpecName "kube-api-access-f6wf4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:53:27.855920 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:27.855886 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-bundle" (OuterVolumeSpecName: "bundle") pod "e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" (UID: "e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:53:27.856563 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:27.856525 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-util" (OuterVolumeSpecName: "util") pod "e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" (UID: "e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:53:27.950000 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:27.949905 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:53:27.950000 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:27.949962 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f6wf4\" (UniqueName: \"kubernetes.io/projected/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-kube-api-access-f6wf4\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:53:27.950000 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:27.949976 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f-util\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:53:28.683698 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:28.683650 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" event={"ID":"e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f","Type":"ContainerDied","Data":"c3af3dbee9e3689ed8e138e324e8aecb95a8762cf7779ed7097936087a0c3905"} Apr 18 02:53:28.683698 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:28.683698 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3af3dbee9e3689ed8e138e324e8aecb95a8762cf7779ed7097936087a0c3905" Apr 18 02:53:28.683901 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:28.683726 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5vq85p" Apr 18 02:53:32.774635 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.774596 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8"] Apr 18 02:53:32.775081 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.774993 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" containerName="extract" Apr 18 02:53:32.775081 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.775012 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" containerName="extract" Apr 18 02:53:32.775081 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.775031 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" containerName="util" Apr 18 02:53:32.775081 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.775039 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" containerName="util" Apr 18 02:53:32.775081 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.775049 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" containerName="pull" Apr 18 02:53:32.775081 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.775058 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" containerName="pull" Apr 18 02:53:32.775373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.775142 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1d88ffc-2e8f-4403-88dd-a8cd6b56da0f" containerName="extract" Apr 18 02:53:32.779825 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.779805 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:32.782426 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.782402 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 18 02:53:32.783505 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.783486 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dj4h2\"" Apr 18 02:53:32.783648 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.783632 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 18 02:53:32.785578 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.785541 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8"] Apr 18 02:53:32.892188 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.892153 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17989db8-6c33-4a49-943d-8cdfa1d0b16a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8\" (UID: \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:32.892380 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.892204 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17989db8-6c33-4a49-943d-8cdfa1d0b16a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8\" (UID: \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:32.892380 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.892312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq6pz\" (UniqueName: \"kubernetes.io/projected/17989db8-6c33-4a49-943d-8cdfa1d0b16a-kube-api-access-dq6pz\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8\" (UID: \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:32.993301 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.993261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17989db8-6c33-4a49-943d-8cdfa1d0b16a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8\" (UID: \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:32.993464 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.993318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17989db8-6c33-4a49-943d-8cdfa1d0b16a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8\" (UID: \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:32.993464 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.993377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6pz\" (UniqueName: \"kubernetes.io/projected/17989db8-6c33-4a49-943d-8cdfa1d0b16a-kube-api-access-dq6pz\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8\" (UID: \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:32.993688 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.993669 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17989db8-6c33-4a49-943d-8cdfa1d0b16a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8\" (UID: \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:32.993748 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:32.993697 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17989db8-6c33-4a49-943d-8cdfa1d0b16a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8\" (UID: \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:33.001961 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:33.001933 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq6pz\" (UniqueName: \"kubernetes.io/projected/17989db8-6c33-4a49-943d-8cdfa1d0b16a-kube-api-access-dq6pz\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8\" (UID: \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:33.089331 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:33.089258 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:33.216639 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:33.216605 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8"] Apr 18 02:53:33.219062 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:53:33.219032 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17989db8_6c33_4a49_943d_8cdfa1d0b16a.slice/crio-d7036f8dd1273e018d32ac5974671137e93838bcc48365673a0ea74278f26a2c WatchSource:0}: Error finding container d7036f8dd1273e018d32ac5974671137e93838bcc48365673a0ea74278f26a2c: Status 404 returned error can't find the container with id d7036f8dd1273e018d32ac5974671137e93838bcc48365673a0ea74278f26a2c Apr 18 02:53:33.702317 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:33.702283 2577 generic.go:358] "Generic (PLEG): container finished" podID="17989db8-6c33-4a49-943d-8cdfa1d0b16a" containerID="27401f4fdc914037756a44f612d6ab0ff5d2964834391f5979b91153cef1e442" exitCode=0 Apr 18 02:53:33.702513 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:33.702365 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" event={"ID":"17989db8-6c33-4a49-943d-8cdfa1d0b16a","Type":"ContainerDied","Data":"27401f4fdc914037756a44f612d6ab0ff5d2964834391f5979b91153cef1e442"} Apr 18 02:53:33.702513 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:33.702415 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" event={"ID":"17989db8-6c33-4a49-943d-8cdfa1d0b16a","Type":"ContainerStarted","Data":"d7036f8dd1273e018d32ac5974671137e93838bcc48365673a0ea74278f26a2c"} Apr 18 02:53:34.634274 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.634187 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4"] Apr 18 02:53:34.637829 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.637813 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 02:53:34.640483 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.640452 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 18 02:53:34.640625 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.640481 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 18 02:53:34.640625 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.640451 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 18 02:53:34.640625 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.640595 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 18 02:53:34.640771 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.640595 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-9x27k\"" Apr 18 02:53:34.649277 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.649250 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4"] Apr 
18 02:53:34.707173 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.707144 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8222645-c589-4075-9ff6-ddaaaa73ed9f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-x5vn4\" (UID: \"f8222645-c589-4075-9ff6-ddaaaa73ed9f\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 02:53:34.707384 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.707208 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzz49\" (UniqueName: \"kubernetes.io/projected/f8222645-c589-4075-9ff6-ddaaaa73ed9f-kube-api-access-kzz49\") pod \"opendatahub-operator-controller-manager-b6bf46549-x5vn4\" (UID: \"f8222645-c589-4075-9ff6-ddaaaa73ed9f\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 02:53:34.707384 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.707239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f8222645-c589-4075-9ff6-ddaaaa73ed9f-webhook-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-x5vn4\" (UID: \"f8222645-c589-4075-9ff6-ddaaaa73ed9f\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 02:53:34.707384 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.707261 2577 generic.go:358] "Generic (PLEG): container finished" podID="17989db8-6c33-4a49-943d-8cdfa1d0b16a" containerID="58a5aa5d31fd47b1c43cb38b32044544621868614369922f6c96375550c05130" exitCode=0 Apr 18 02:53:34.707384 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.707326 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" 
event={"ID":"17989db8-6c33-4a49-943d-8cdfa1d0b16a","Type":"ContainerDied","Data":"58a5aa5d31fd47b1c43cb38b32044544621868614369922f6c96375550c05130"} Apr 18 02:53:34.808275 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.808239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8222645-c589-4075-9ff6-ddaaaa73ed9f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-x5vn4\" (UID: \"f8222645-c589-4075-9ff6-ddaaaa73ed9f\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 02:53:34.808474 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.808301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzz49\" (UniqueName: \"kubernetes.io/projected/f8222645-c589-4075-9ff6-ddaaaa73ed9f-kube-api-access-kzz49\") pod \"opendatahub-operator-controller-manager-b6bf46549-x5vn4\" (UID: \"f8222645-c589-4075-9ff6-ddaaaa73ed9f\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 02:53:34.808474 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.808325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f8222645-c589-4075-9ff6-ddaaaa73ed9f-webhook-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-x5vn4\" (UID: \"f8222645-c589-4075-9ff6-ddaaaa73ed9f\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 02:53:34.811260 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.811223 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f8222645-c589-4075-9ff6-ddaaaa73ed9f-webhook-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-x5vn4\" (UID: \"f8222645-c589-4075-9ff6-ddaaaa73ed9f\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 
02:53:34.811413 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.811310 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8222645-c589-4075-9ff6-ddaaaa73ed9f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-x5vn4\" (UID: \"f8222645-c589-4075-9ff6-ddaaaa73ed9f\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 02:53:34.820723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.820696 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzz49\" (UniqueName: \"kubernetes.io/projected/f8222645-c589-4075-9ff6-ddaaaa73ed9f-kube-api-access-kzz49\") pod \"opendatahub-operator-controller-manager-b6bf46549-x5vn4\" (UID: \"f8222645-c589-4075-9ff6-ddaaaa73ed9f\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 02:53:34.947908 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:34.947800 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 02:53:35.084667 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:35.082581 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4"] Apr 18 02:53:35.086870 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:53:35.086840 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8222645_c589_4075_9ff6_ddaaaa73ed9f.slice/crio-d06a8b02daabc53ba320863b777874783f1ea391112fb7ba07e9922be403774c WatchSource:0}: Error finding container d06a8b02daabc53ba320863b777874783f1ea391112fb7ba07e9922be403774c: Status 404 returned error can't find the container with id d06a8b02daabc53ba320863b777874783f1ea391112fb7ba07e9922be403774c Apr 18 02:53:35.713953 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:35.713913 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" event={"ID":"f8222645-c589-4075-9ff6-ddaaaa73ed9f","Type":"ContainerStarted","Data":"d06a8b02daabc53ba320863b777874783f1ea391112fb7ba07e9922be403774c"} Apr 18 02:53:35.716356 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:35.716323 2577 generic.go:358] "Generic (PLEG): container finished" podID="17989db8-6c33-4a49-943d-8cdfa1d0b16a" containerID="8a061f4944d330fffa921c3e41cffe1d68e79dc820406e7cafea0ab732709e85" exitCode=0 Apr 18 02:53:35.716495 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:35.716368 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" event={"ID":"17989db8-6c33-4a49-943d-8cdfa1d0b16a","Type":"ContainerDied","Data":"8a061f4944d330fffa921c3e41cffe1d68e79dc820406e7cafea0ab732709e85"} Apr 18 02:53:37.573178 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.573157 2577 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:37.632046 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.632020 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17989db8-6c33-4a49-943d-8cdfa1d0b16a-bundle\") pod \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\" (UID: \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\") " Apr 18 02:53:37.632160 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.632127 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17989db8-6c33-4a49-943d-8cdfa1d0b16a-util\") pod \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\" (UID: \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\") " Apr 18 02:53:37.632226 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.632180 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq6pz\" (UniqueName: \"kubernetes.io/projected/17989db8-6c33-4a49-943d-8cdfa1d0b16a-kube-api-access-dq6pz\") pod \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\" (UID: \"17989db8-6c33-4a49-943d-8cdfa1d0b16a\") " Apr 18 02:53:37.632830 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.632804 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17989db8-6c33-4a49-943d-8cdfa1d0b16a-bundle" (OuterVolumeSpecName: "bundle") pod "17989db8-6c33-4a49-943d-8cdfa1d0b16a" (UID: "17989db8-6c33-4a49-943d-8cdfa1d0b16a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:53:37.634198 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.634174 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17989db8-6c33-4a49-943d-8cdfa1d0b16a-kube-api-access-dq6pz" (OuterVolumeSpecName: "kube-api-access-dq6pz") pod "17989db8-6c33-4a49-943d-8cdfa1d0b16a" (UID: "17989db8-6c33-4a49-943d-8cdfa1d0b16a"). InnerVolumeSpecName "kube-api-access-dq6pz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:53:37.637304 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.637275 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17989db8-6c33-4a49-943d-8cdfa1d0b16a-util" (OuterVolumeSpecName: "util") pod "17989db8-6c33-4a49-943d-8cdfa1d0b16a" (UID: "17989db8-6c33-4a49-943d-8cdfa1d0b16a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:53:37.726803 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.726767 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" event={"ID":"f8222645-c589-4075-9ff6-ddaaaa73ed9f","Type":"ContainerStarted","Data":"ac3f6116701533208768a0153f3376440d72e5e17ad6b38d1e22c2665047438e"} Apr 18 02:53:37.726992 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.726853 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 02:53:37.728324 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.728300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" event={"ID":"17989db8-6c33-4a49-943d-8cdfa1d0b16a","Type":"ContainerDied","Data":"d7036f8dd1273e018d32ac5974671137e93838bcc48365673a0ea74278f26a2c"} Apr 18 02:53:37.728449 ip-10-0-128-79 kubenswrapper[2577]: 
I0418 02:53:37.728324 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9746r8" Apr 18 02:53:37.728449 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.728326 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7036f8dd1273e018d32ac5974671137e93838bcc48365673a0ea74278f26a2c" Apr 18 02:53:37.732762 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.732743 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17989db8-6c33-4a49-943d-8cdfa1d0b16a-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:53:37.732762 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.732762 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17989db8-6c33-4a49-943d-8cdfa1d0b16a-util\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:53:37.732947 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.732771 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dq6pz\" (UniqueName: \"kubernetes.io/projected/17989db8-6c33-4a49-943d-8cdfa1d0b16a-kube-api-access-dq6pz\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:53:37.747846 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:37.747801 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" podStartSLOduration=1.230144103 podStartE2EDuration="3.747785875s" podCreationTimestamp="2026-04-18 02:53:34 +0000 UTC" firstStartedPulling="2026-04-18 02:53:35.088969116 +0000 UTC m=+458.228820434" lastFinishedPulling="2026-04-18 02:53:37.606610893 +0000 UTC m=+460.746462206" observedRunningTime="2026-04-18 02:53:37.745779511 +0000 UTC m=+460.885630847" watchObservedRunningTime="2026-04-18 02:53:37.747785875 +0000 UTC 
m=+460.887637234" Apr 18 02:53:48.734739 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:48.734706 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-x5vn4" Apr 18 02:53:51.547293 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.547257 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7"] Apr 18 02:53:51.547723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.547576 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17989db8-6c33-4a49-943d-8cdfa1d0b16a" containerName="extract" Apr 18 02:53:51.547723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.547590 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="17989db8-6c33-4a49-943d-8cdfa1d0b16a" containerName="extract" Apr 18 02:53:51.547723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.547602 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17989db8-6c33-4a49-943d-8cdfa1d0b16a" containerName="pull" Apr 18 02:53:51.547723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.547607 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="17989db8-6c33-4a49-943d-8cdfa1d0b16a" containerName="pull" Apr 18 02:53:51.547723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.547617 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17989db8-6c33-4a49-943d-8cdfa1d0b16a" containerName="util" Apr 18 02:53:51.547723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.547626 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="17989db8-6c33-4a49-943d-8cdfa1d0b16a" containerName="util" Apr 18 02:53:51.547723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.547724 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="17989db8-6c33-4a49-943d-8cdfa1d0b16a" containerName="extract" Apr 18 02:53:51.550773 
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.550753 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" Apr 18 02:53:51.553601 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.553568 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 18 02:53:51.553706 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.553663 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 18 02:53:51.554597 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.554581 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dj4h2\"" Apr 18 02:53:51.574458 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.574429 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7"] Apr 18 02:53:51.645984 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.645947 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52993123-c7d1-4911-9717-187be5046d3a-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7\" (UID: \"52993123-c7d1-4911-9717-187be5046d3a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" Apr 18 02:53:51.646175 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.646026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52993123-c7d1-4911-9717-187be5046d3a-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7\" (UID: \"52993123-c7d1-4911-9717-187be5046d3a\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" Apr 18 02:53:51.646175 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.646093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8rfk\" (UniqueName: \"kubernetes.io/projected/52993123-c7d1-4911-9717-187be5046d3a-kube-api-access-h8rfk\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7\" (UID: \"52993123-c7d1-4911-9717-187be5046d3a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" Apr 18 02:53:51.746566 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.746518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52993123-c7d1-4911-9717-187be5046d3a-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7\" (UID: \"52993123-c7d1-4911-9717-187be5046d3a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" Apr 18 02:53:51.746746 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.746602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52993123-c7d1-4911-9717-187be5046d3a-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7\" (UID: \"52993123-c7d1-4911-9717-187be5046d3a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" Apr 18 02:53:51.746746 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.746677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8rfk\" (UniqueName: \"kubernetes.io/projected/52993123-c7d1-4911-9717-187be5046d3a-kube-api-access-h8rfk\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7\" (UID: \"52993123-c7d1-4911-9717-187be5046d3a\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" Apr 18 02:53:51.746914 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.746892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52993123-c7d1-4911-9717-187be5046d3a-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7\" (UID: \"52993123-c7d1-4911-9717-187be5046d3a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" Apr 18 02:53:51.747001 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.746977 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52993123-c7d1-4911-9717-187be5046d3a-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7\" (UID: \"52993123-c7d1-4911-9717-187be5046d3a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" Apr 18 02:53:51.755909 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.755882 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8rfk\" (UniqueName: \"kubernetes.io/projected/52993123-c7d1-4911-9717-187be5046d3a-kube-api-access-h8rfk\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7\" (UID: \"52993123-c7d1-4911-9717-187be5046d3a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" Apr 18 02:53:51.859855 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.859765 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7"
Apr 18 02:53:51.991048 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:51.991008 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7"]
Apr 18 02:53:51.994770 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:53:51.994734 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52993123_c7d1_4911_9717_187be5046d3a.slice/crio-257e15c52c27949ea3fc90374975d2f9287222583697a2f9f82a3c902c85fba5 WatchSource:0}: Error finding container 257e15c52c27949ea3fc90374975d2f9287222583697a2f9f82a3c902c85fba5: Status 404 returned error can't find the container with id 257e15c52c27949ea3fc90374975d2f9287222583697a2f9f82a3c902c85fba5
Apr 18 02:53:52.782137 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:52.782099 2577 generic.go:358] "Generic (PLEG): container finished" podID="52993123-c7d1-4911-9717-187be5046d3a" containerID="951785db25fbd718ee7fd49b91236dea1383c1a0e6ca02bdac3b2d80864fee30" exitCode=0
Apr 18 02:53:52.782573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:52.782186 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" event={"ID":"52993123-c7d1-4911-9717-187be5046d3a","Type":"ContainerDied","Data":"951785db25fbd718ee7fd49b91236dea1383c1a0e6ca02bdac3b2d80864fee30"}
Apr 18 02:53:52.782573 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:52.782221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" event={"ID":"52993123-c7d1-4911-9717-187be5046d3a","Type":"ContainerStarted","Data":"257e15c52c27949ea3fc90374975d2f9287222583697a2f9f82a3c902c85fba5"}
Apr 18 02:53:52.917693 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:52.917655 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"]
Apr 18 02:53:52.921724 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:52.921702 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"
Apr 18 02:53:52.924051 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:52.924032 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-4ls7q\""
Apr 18 02:53:52.924168 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:52.924091 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 18 02:53:52.924168 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:52.924095 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 18 02:53:52.930281 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:52.930258 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"]
Apr 18 02:53:53.059964 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.059867 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9456592c-61cf-4a59-820e-7c061a014993-tls-certs\") pod \"kube-auth-proxy-79f769b49f-b22gm\" (UID: \"9456592c-61cf-4a59-820e-7c061a014993\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"
Apr 18 02:53:53.059964 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.059912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9456592c-61cf-4a59-820e-7c061a014993-tmp\") pod \"kube-auth-proxy-79f769b49f-b22gm\" (UID: \"9456592c-61cf-4a59-820e-7c061a014993\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"
Apr 18 02:53:53.060164 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.060041 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwhf8\" (UniqueName: \"kubernetes.io/projected/9456592c-61cf-4a59-820e-7c061a014993-kube-api-access-fwhf8\") pod \"kube-auth-proxy-79f769b49f-b22gm\" (UID: \"9456592c-61cf-4a59-820e-7c061a014993\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"
Apr 18 02:53:53.160934 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.160893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9456592c-61cf-4a59-820e-7c061a014993-tls-certs\") pod \"kube-auth-proxy-79f769b49f-b22gm\" (UID: \"9456592c-61cf-4a59-820e-7c061a014993\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"
Apr 18 02:53:53.160934 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.160939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9456592c-61cf-4a59-820e-7c061a014993-tmp\") pod \"kube-auth-proxy-79f769b49f-b22gm\" (UID: \"9456592c-61cf-4a59-820e-7c061a014993\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"
Apr 18 02:53:53.161176 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.161010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwhf8\" (UniqueName: \"kubernetes.io/projected/9456592c-61cf-4a59-820e-7c061a014993-kube-api-access-fwhf8\") pod \"kube-auth-proxy-79f769b49f-b22gm\" (UID: \"9456592c-61cf-4a59-820e-7c061a014993\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"
Apr 18 02:53:53.163333 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.163305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9456592c-61cf-4a59-820e-7c061a014993-tmp\") pod \"kube-auth-proxy-79f769b49f-b22gm\" (UID: \"9456592c-61cf-4a59-820e-7c061a014993\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"
Apr 18 02:53:53.163333 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.163326 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9456592c-61cf-4a59-820e-7c061a014993-tls-certs\") pod \"kube-auth-proxy-79f769b49f-b22gm\" (UID: \"9456592c-61cf-4a59-820e-7c061a014993\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"
Apr 18 02:53:53.167861 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.167838 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwhf8\" (UniqueName: \"kubernetes.io/projected/9456592c-61cf-4a59-820e-7c061a014993-kube-api-access-fwhf8\") pod \"kube-auth-proxy-79f769b49f-b22gm\" (UID: \"9456592c-61cf-4a59-820e-7c061a014993\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"
Apr 18 02:53:53.233091 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.233069 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"
Apr 18 02:53:53.347594 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.347567 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-79f769b49f-b22gm"]
Apr 18 02:53:53.350026 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:53:53.349995 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9456592c_61cf_4a59_820e_7c061a014993.slice/crio-b04e6fa0cf4d360f6f36f1b31f6455b5c387d1142304f83bb56ad255671ff67f WatchSource:0}: Error finding container b04e6fa0cf4d360f6f36f1b31f6455b5c387d1142304f83bb56ad255671ff67f: Status 404 returned error can't find the container with id b04e6fa0cf4d360f6f36f1b31f6455b5c387d1142304f83bb56ad255671ff67f
Apr 18 02:53:53.787189 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.787147 2577 generic.go:358] "Generic (PLEG): container finished" podID="52993123-c7d1-4911-9717-187be5046d3a" containerID="990fa144afa0281f02ec6d13da5b2bef679ac2bb4bb2cbebd34a920fad875482" exitCode=0
Apr 18 02:53:53.787592 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.787190 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" event={"ID":"52993123-c7d1-4911-9717-187be5046d3a","Type":"ContainerDied","Data":"990fa144afa0281f02ec6d13da5b2bef679ac2bb4bb2cbebd34a920fad875482"}
Apr 18 02:53:53.788422 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:53.788398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm" event={"ID":"9456592c-61cf-4a59-820e-7c061a014993","Type":"ContainerStarted","Data":"b04e6fa0cf4d360f6f36f1b31f6455b5c387d1142304f83bb56ad255671ff67f"}
Apr 18 02:53:54.794235 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:54.794194 2577 generic.go:358] "Generic (PLEG): container finished" podID="52993123-c7d1-4911-9717-187be5046d3a" containerID="bb06f41e22e94594d0200c01865d578b885416ee810e9e7658a2ad4adcf2d452" exitCode=0
Apr 18 02:53:54.794701 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:54.794352 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" event={"ID":"52993123-c7d1-4911-9717-187be5046d3a","Type":"ContainerDied","Data":"bb06f41e22e94594d0200c01865d578b885416ee810e9e7658a2ad4adcf2d452"}
Apr 18 02:53:56.076846 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.076761 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-djdc9"]
Apr 18 02:53:56.081493 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.081472 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9"
Apr 18 02:53:56.083747 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.083725 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 18 02:53:56.083863 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.083851 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-tnqw4\""
Apr 18 02:53:56.089446 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.089424 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-djdc9"]
Apr 18 02:53:56.187043 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.187004 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f7c1882-395f-4300-aeaa-4c83728c6e2e-cert\") pod \"odh-model-controller-858dbf95b8-djdc9\" (UID: \"0f7c1882-395f-4300-aeaa-4c83728c6e2e\") " pod="opendatahub/odh-model-controller-858dbf95b8-djdc9"
Apr 18 02:53:56.187228 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.187071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jb49\" (UniqueName: \"kubernetes.io/projected/0f7c1882-395f-4300-aeaa-4c83728c6e2e-kube-api-access-7jb49\") pod \"odh-model-controller-858dbf95b8-djdc9\" (UID: \"0f7c1882-395f-4300-aeaa-4c83728c6e2e\") " pod="opendatahub/odh-model-controller-858dbf95b8-djdc9"
Apr 18 02:53:56.287755 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.287716 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f7c1882-395f-4300-aeaa-4c83728c6e2e-cert\") pod \"odh-model-controller-858dbf95b8-djdc9\" (UID: \"0f7c1882-395f-4300-aeaa-4c83728c6e2e\") " pod="opendatahub/odh-model-controller-858dbf95b8-djdc9"
Apr 18 02:53:56.287964 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.287763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jb49\" (UniqueName: \"kubernetes.io/projected/0f7c1882-395f-4300-aeaa-4c83728c6e2e-kube-api-access-7jb49\") pod \"odh-model-controller-858dbf95b8-djdc9\" (UID: \"0f7c1882-395f-4300-aeaa-4c83728c6e2e\") " pod="opendatahub/odh-model-controller-858dbf95b8-djdc9"
Apr 18 02:53:56.287964 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:53:56.287898 2577 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 18 02:53:56.288091 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:53:56.287986 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f7c1882-395f-4300-aeaa-4c83728c6e2e-cert podName:0f7c1882-395f-4300-aeaa-4c83728c6e2e nodeName:}" failed. No retries permitted until 2026-04-18 02:53:56.787961473 +0000 UTC m=+479.927812789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f7c1882-395f-4300-aeaa-4c83728c6e2e-cert") pod "odh-model-controller-858dbf95b8-djdc9" (UID: "0f7c1882-395f-4300-aeaa-4c83728c6e2e") : secret "odh-model-controller-webhook-cert" not found
Apr 18 02:53:56.296578 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.296536 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jb49\" (UniqueName: \"kubernetes.io/projected/0f7c1882-395f-4300-aeaa-4c83728c6e2e-kube-api-access-7jb49\") pod \"odh-model-controller-858dbf95b8-djdc9\" (UID: \"0f7c1882-395f-4300-aeaa-4c83728c6e2e\") " pod="opendatahub/odh-model-controller-858dbf95b8-djdc9"
Apr 18 02:53:56.321436 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.321419 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7"
Apr 18 02:53:56.489414 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.489387 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52993123-c7d1-4911-9717-187be5046d3a-util\") pod \"52993123-c7d1-4911-9717-187be5046d3a\" (UID: \"52993123-c7d1-4911-9717-187be5046d3a\") "
Apr 18 02:53:56.489597 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.489484 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8rfk\" (UniqueName: \"kubernetes.io/projected/52993123-c7d1-4911-9717-187be5046d3a-kube-api-access-h8rfk\") pod \"52993123-c7d1-4911-9717-187be5046d3a\" (UID: \"52993123-c7d1-4911-9717-187be5046d3a\") "
Apr 18 02:53:56.489597 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.489516 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52993123-c7d1-4911-9717-187be5046d3a-bundle\") pod \"52993123-c7d1-4911-9717-187be5046d3a\" (UID: \"52993123-c7d1-4911-9717-187be5046d3a\") "
Apr 18 02:53:56.490584 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.490532 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52993123-c7d1-4911-9717-187be5046d3a-bundle" (OuterVolumeSpecName: "bundle") pod "52993123-c7d1-4911-9717-187be5046d3a" (UID: "52993123-c7d1-4911-9717-187be5046d3a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:53:56.491589 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.491545 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52993123-c7d1-4911-9717-187be5046d3a-kube-api-access-h8rfk" (OuterVolumeSpecName: "kube-api-access-h8rfk") pod "52993123-c7d1-4911-9717-187be5046d3a" (UID: "52993123-c7d1-4911-9717-187be5046d3a"). InnerVolumeSpecName "kube-api-access-h8rfk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:53:56.494325 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.494298 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52993123-c7d1-4911-9717-187be5046d3a-util" (OuterVolumeSpecName: "util") pod "52993123-c7d1-4911-9717-187be5046d3a" (UID: "52993123-c7d1-4911-9717-187be5046d3a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:53:56.591664 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.591623 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h8rfk\" (UniqueName: \"kubernetes.io/projected/52993123-c7d1-4911-9717-187be5046d3a-kube-api-access-h8rfk\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\""
Apr 18 02:53:56.591664 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.591660 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52993123-c7d1-4911-9717-187be5046d3a-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\""
Apr 18 02:53:56.591861 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.591719 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52993123-c7d1-4911-9717-187be5046d3a-util\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\""
Apr 18 02:53:56.793340 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.793307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f7c1882-395f-4300-aeaa-4c83728c6e2e-cert\") pod \"odh-model-controller-858dbf95b8-djdc9\" (UID: \"0f7c1882-395f-4300-aeaa-4c83728c6e2e\") " pod="opendatahub/odh-model-controller-858dbf95b8-djdc9"
Apr 18 02:53:56.795909 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.795885 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f7c1882-395f-4300-aeaa-4c83728c6e2e-cert\") pod \"odh-model-controller-858dbf95b8-djdc9\" (UID: \"0f7c1882-395f-4300-aeaa-4c83728c6e2e\") " pod="opendatahub/odh-model-controller-858dbf95b8-djdc9"
Apr 18 02:53:56.802789 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.802757 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm" event={"ID":"9456592c-61cf-4a59-820e-7c061a014993","Type":"ContainerStarted","Data":"761f79320e01525e595368d5208916f17c7d9ac0e2925a51b8079b350599c975"}
Apr 18 02:53:56.804368 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.804346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7" event={"ID":"52993123-c7d1-4911-9717-187be5046d3a","Type":"ContainerDied","Data":"257e15c52c27949ea3fc90374975d2f9287222583697a2f9f82a3c902c85fba5"}
Apr 18 02:53:56.804368 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.804371 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="257e15c52c27949ea3fc90374975d2f9287222583697a2f9f82a3c902c85fba5"
Apr 18 02:53:56.804539 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.804375 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nrjx7"
Apr 18 02:53:56.830432 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.830388 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-79f769b49f-b22gm" podStartSLOduration=1.817926782 podStartE2EDuration="4.830374054s" podCreationTimestamp="2026-04-18 02:53:52 +0000 UTC" firstStartedPulling="2026-04-18 02:53:53.351763944 +0000 UTC m=+476.491615256" lastFinishedPulling="2026-04-18 02:53:56.364211201 +0000 UTC m=+479.504062528" observedRunningTime="2026-04-18 02:53:56.828025724 +0000 UTC m=+479.967877059" watchObservedRunningTime="2026-04-18 02:53:56.830374054 +0000 UTC m=+479.970225469"
Apr 18 02:53:56.994720 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:56.994680 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9"
Apr 18 02:53:57.325133 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:57.325105 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-djdc9"]
Apr 18 02:53:57.327233 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:53:57.327205 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f7c1882_395f_4300_aeaa_4c83728c6e2e.slice/crio-b132ab81d5f5a1e9de1cb272997b9440901fe4675adb2a85c37d9e11257d7295 WatchSource:0}: Error finding container b132ab81d5f5a1e9de1cb272997b9440901fe4675adb2a85c37d9e11257d7295: Status 404 returned error can't find the container with id b132ab81d5f5a1e9de1cb272997b9440901fe4675adb2a85c37d9e11257d7295
Apr 18 02:53:57.809000 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:53:57.808959 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9" event={"ID":"0f7c1882-395f-4300-aeaa-4c83728c6e2e","Type":"ContainerStarted","Data":"b132ab81d5f5a1e9de1cb272997b9440901fe4675adb2a85c37d9e11257d7295"}
Apr 18 02:54:00.821132 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:00.821098 2577 generic.go:358] "Generic (PLEG): container finished" podID="0f7c1882-395f-4300-aeaa-4c83728c6e2e" containerID="9a5520bce310d8a47df30eedbf0bac233be363a4efad04373c5db5ffa7e89a6a" exitCode=1
Apr 18 02:54:00.821498 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:00.821187 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9" event={"ID":"0f7c1882-395f-4300-aeaa-4c83728c6e2e","Type":"ContainerDied","Data":"9a5520bce310d8a47df30eedbf0bac233be363a4efad04373c5db5ffa7e89a6a"}
Apr 18 02:54:00.821498 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:00.821358 2577 scope.go:117] "RemoveContainer" containerID="9a5520bce310d8a47df30eedbf0bac233be363a4efad04373c5db5ffa7e89a6a"
Apr 18 02:54:01.826306 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:01.826266 2577 generic.go:358] "Generic (PLEG): container finished" podID="0f7c1882-395f-4300-aeaa-4c83728c6e2e" containerID="921663f7f978c4ac9badf818c74b2f3ec6e62bec9103553a5e23569c17b4552b" exitCode=1
Apr 18 02:54:01.826774 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:01.826345 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9" event={"ID":"0f7c1882-395f-4300-aeaa-4c83728c6e2e","Type":"ContainerDied","Data":"921663f7f978c4ac9badf818c74b2f3ec6e62bec9103553a5e23569c17b4552b"}
Apr 18 02:54:01.826774 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:01.826397 2577 scope.go:117] "RemoveContainer" containerID="9a5520bce310d8a47df30eedbf0bac233be363a4efad04373c5db5ffa7e89a6a"
Apr 18 02:54:01.826774 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:01.826684 2577 scope.go:117] "RemoveContainer" containerID="921663f7f978c4ac9badf818c74b2f3ec6e62bec9103553a5e23569c17b4552b"
Apr 18 02:54:01.826943 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:54:01.826904 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-djdc9_opendatahub(0f7c1882-395f-4300-aeaa-4c83728c6e2e)\"" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9" podUID="0f7c1882-395f-4300-aeaa-4c83728c6e2e"
Apr 18 02:54:02.769681 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.769647 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-82x78"]
Apr 18 02:54:02.770054 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.770037 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52993123-c7d1-4911-9717-187be5046d3a" containerName="pull"
Apr 18 02:54:02.770138 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.770057 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="52993123-c7d1-4911-9717-187be5046d3a" containerName="pull"
Apr 18 02:54:02.770138 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.770073 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52993123-c7d1-4911-9717-187be5046d3a" containerName="util"
Apr 18 02:54:02.770138 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.770080 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="52993123-c7d1-4911-9717-187be5046d3a" containerName="util"
Apr 18 02:54:02.770138 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.770114 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52993123-c7d1-4911-9717-187be5046d3a" containerName="extract"
Apr 18 02:54:02.770138 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.770122 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="52993123-c7d1-4911-9717-187be5046d3a" containerName="extract"
Apr 18 02:54:02.770359 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.770209 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="52993123-c7d1-4911-9717-187be5046d3a" containerName="extract"
Apr 18 02:54:02.774159 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.774139 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-82x78"
Apr 18 02:54:02.776767 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.776745 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 18 02:54:02.776767 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.776762 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-bz9qx\""
Apr 18 02:54:02.780835 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.780807 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-82x78"]
Apr 18 02:54:02.832237 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.832204 2577 scope.go:117] "RemoveContainer" containerID="921663f7f978c4ac9badf818c74b2f3ec6e62bec9103553a5e23569c17b4552b"
Apr 18 02:54:02.832699 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:54:02.832478 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-djdc9_opendatahub(0f7c1882-395f-4300-aeaa-4c83728c6e2e)\"" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9" podUID="0f7c1882-395f-4300-aeaa-4c83728c6e2e"
Apr 18 02:54:02.945476 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.945434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1c32584-1033-4e0f-a731-86a24df69910-cert\") pod \"kserve-controller-manager-856948b99f-82x78\" (UID: \"b1c32584-1033-4e0f-a731-86a24df69910\") " pod="opendatahub/kserve-controller-manager-856948b99f-82x78"
Apr 18 02:54:02.946433 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:02.946384 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dvtq\" (UniqueName: \"kubernetes.io/projected/b1c32584-1033-4e0f-a731-86a24df69910-kube-api-access-6dvtq\") pod \"kserve-controller-manager-856948b99f-82x78\" (UID: \"b1c32584-1033-4e0f-a731-86a24df69910\") " pod="opendatahub/kserve-controller-manager-856948b99f-82x78"
Apr 18 02:54:03.047806 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:03.047691 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1c32584-1033-4e0f-a731-86a24df69910-cert\") pod \"kserve-controller-manager-856948b99f-82x78\" (UID: \"b1c32584-1033-4e0f-a731-86a24df69910\") " pod="opendatahub/kserve-controller-manager-856948b99f-82x78"
Apr 18 02:54:03.047806 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:03.047777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dvtq\" (UniqueName: \"kubernetes.io/projected/b1c32584-1033-4e0f-a731-86a24df69910-kube-api-access-6dvtq\") pod \"kserve-controller-manager-856948b99f-82x78\" (UID: \"b1c32584-1033-4e0f-a731-86a24df69910\") " pod="opendatahub/kserve-controller-manager-856948b99f-82x78"
Apr 18 02:54:03.048019 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:54:03.047864 2577 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 18 02:54:03.048019 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:54:03.047957 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1c32584-1033-4e0f-a731-86a24df69910-cert podName:b1c32584-1033-4e0f-a731-86a24df69910 nodeName:}" failed. No retries permitted until 2026-04-18 02:54:03.547936515 +0000 UTC m=+486.687787832 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1c32584-1033-4e0f-a731-86a24df69910-cert") pod "kserve-controller-manager-856948b99f-82x78" (UID: "b1c32584-1033-4e0f-a731-86a24df69910") : secret "kserve-webhook-server-cert" not found
Apr 18 02:54:03.056364 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:03.056329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dvtq\" (UniqueName: \"kubernetes.io/projected/b1c32584-1033-4e0f-a731-86a24df69910-kube-api-access-6dvtq\") pod \"kserve-controller-manager-856948b99f-82x78\" (UID: \"b1c32584-1033-4e0f-a731-86a24df69910\") " pod="opendatahub/kserve-controller-manager-856948b99f-82x78"
Apr 18 02:54:03.551403 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:03.551355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1c32584-1033-4e0f-a731-86a24df69910-cert\") pod \"kserve-controller-manager-856948b99f-82x78\" (UID: \"b1c32584-1033-4e0f-a731-86a24df69910\") " pod="opendatahub/kserve-controller-manager-856948b99f-82x78"
Apr 18 02:54:03.553746 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:03.553725 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1c32584-1033-4e0f-a731-86a24df69910-cert\") pod \"kserve-controller-manager-856948b99f-82x78\" (UID: \"b1c32584-1033-4e0f-a731-86a24df69910\") " pod="opendatahub/kserve-controller-manager-856948b99f-82x78"
Apr 18 02:54:03.685961 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:03.685922 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-82x78"
Apr 18 02:54:03.804774 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:03.804743 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-82x78"]
Apr 18 02:54:03.806905 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:54:03.806874 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1c32584_1033_4e0f_a731_86a24df69910.slice/crio-40e24be67b0499e71e8f51498bce6b515498a94b50dc44a56fd5e40d4fbfcc6f WatchSource:0}: Error finding container 40e24be67b0499e71e8f51498bce6b515498a94b50dc44a56fd5e40d4fbfcc6f: Status 404 returned error can't find the container with id 40e24be67b0499e71e8f51498bce6b515498a94b50dc44a56fd5e40d4fbfcc6f
Apr 18 02:54:03.836849 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:03.836818 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-82x78" event={"ID":"b1c32584-1033-4e0f-a731-86a24df69910","Type":"ContainerStarted","Data":"40e24be67b0499e71e8f51498bce6b515498a94b50dc44a56fd5e40d4fbfcc6f"}
Apr 18 02:54:05.388322 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.388286 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"]
Apr 18 02:54:05.392060 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.392035 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"
Apr 18 02:54:05.400242 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.400210 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dj4h2\""
Apr 18 02:54:05.400242 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.400225 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 18 02:54:05.401166 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.401148 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 18 02:54:05.408061 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.408032 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"]
Apr 18 02:54:05.570703 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.570663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt5q8\" (UniqueName: \"kubernetes.io/projected/b416ba94-9393-4327-9083-ea5a5cb3bc1a-kube-api-access-lt5q8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq\" (UID: \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"
Apr 18 02:54:05.570703 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.570708 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b416ba94-9393-4327-9083-ea5a5cb3bc1a-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq\" (UID: \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"
Apr 18 02:54:05.570961 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.570864 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b416ba94-9393-4327-9083-ea5a5cb3bc1a-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq\" (UID: \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"
Apr 18 02:54:05.672248 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.672157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b416ba94-9393-4327-9083-ea5a5cb3bc1a-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq\" (UID: \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"
Apr 18 02:54:05.672248 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.672214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5q8\" (UniqueName: \"kubernetes.io/projected/b416ba94-9393-4327-9083-ea5a5cb3bc1a-kube-api-access-lt5q8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq\" (UID: \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"
Apr 18 02:54:05.672248 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.672241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b416ba94-9393-4327-9083-ea5a5cb3bc1a-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq\" (UID: \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"
Apr 18 02:54:05.672637 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.672611 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b416ba94-9393-4327-9083-ea5a5cb3bc1a-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq\" (UID: \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"
Apr 18 02:54:05.672637 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.672627 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b416ba94-9393-4327-9083-ea5a5cb3bc1a-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq\" (UID: \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"
Apr 18 02:54:05.690499 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.690467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt5q8\" (UniqueName: \"kubernetes.io/projected/b416ba94-9393-4327-9083-ea5a5cb3bc1a-kube-api-access-lt5q8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq\" (UID: \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"
Apr 18 02:54:05.703146 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.703114 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"
Apr 18 02:54:05.850088 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:05.850055 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq"]
Apr 18 02:54:06.270353 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:54:06.270319 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb416ba94_9393_4327_9083_ea5a5cb3bc1a.slice/crio-fdd37b235f6d7f92a2cf9b5349dc8746dff3d9d41593d57ab7e325e3b96fe9b7 WatchSource:0}: Error finding container fdd37b235f6d7f92a2cf9b5349dc8746dff3d9d41593d57ab7e325e3b96fe9b7: Status 404 returned error can't find the container with id fdd37b235f6d7f92a2cf9b5349dc8746dff3d9d41593d57ab7e325e3b96fe9b7
Apr 18 02:54:06.683212 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.683131 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp"]
Apr 18 02:54:06.686471 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.686454 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" Apr 18 02:54:06.689283 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.689259 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 18 02:54:06.689395 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.689301 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 18 02:54:06.689395 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.689266 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-d5zpg\"" Apr 18 02:54:06.701689 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.701664 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp"] Apr 18 02:54:06.780647 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.780609 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-598jz\" (UniqueName: \"kubernetes.io/projected/b0747d85-6a38-47a0-bd2b-1f21597229db-kube-api-access-598jz\") pod \"servicemesh-operator3-55f49c5f94-8n2cp\" (UID: \"b0747d85-6a38-47a0-bd2b-1f21597229db\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" Apr 18 02:54:06.780816 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.780658 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/b0747d85-6a38-47a0-bd2b-1f21597229db-operator-config\") pod \"servicemesh-operator3-55f49c5f94-8n2cp\" (UID: \"b0747d85-6a38-47a0-bd2b-1f21597229db\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" Apr 18 02:54:06.849735 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.849694 2577 generic.go:358] "Generic (PLEG): 
container finished" podID="b416ba94-9393-4327-9083-ea5a5cb3bc1a" containerID="91c1b2ff2b9bdcb95b047d888618ad5421ba7e99f4a1c9a8e17c56450ff504c0" exitCode=0 Apr 18 02:54:06.849900 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.849754 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq" event={"ID":"b416ba94-9393-4327-9083-ea5a5cb3bc1a","Type":"ContainerDied","Data":"91c1b2ff2b9bdcb95b047d888618ad5421ba7e99f4a1c9a8e17c56450ff504c0"} Apr 18 02:54:06.849900 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.849792 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq" event={"ID":"b416ba94-9393-4327-9083-ea5a5cb3bc1a","Type":"ContainerStarted","Data":"fdd37b235f6d7f92a2cf9b5349dc8746dff3d9d41593d57ab7e325e3b96fe9b7"} Apr 18 02:54:06.851199 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.851139 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-82x78" event={"ID":"b1c32584-1033-4e0f-a731-86a24df69910","Type":"ContainerStarted","Data":"939740317e9953ef150ddc30e4dc147a834c7c9eaa8740f212a88df1e3f3f1b9"} Apr 18 02:54:06.851383 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.851357 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-82x78" Apr 18 02:54:06.881766 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.881729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/b0747d85-6a38-47a0-bd2b-1f21597229db-operator-config\") pod \"servicemesh-operator3-55f49c5f94-8n2cp\" (UID: \"b0747d85-6a38-47a0-bd2b-1f21597229db\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" Apr 18 02:54:06.881922 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:54:06.881835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-598jz\" (UniqueName: \"kubernetes.io/projected/b0747d85-6a38-47a0-bd2b-1f21597229db-kube-api-access-598jz\") pod \"servicemesh-operator3-55f49c5f94-8n2cp\" (UID: \"b0747d85-6a38-47a0-bd2b-1f21597229db\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" Apr 18 02:54:06.884379 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.884356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/b0747d85-6a38-47a0-bd2b-1f21597229db-operator-config\") pod \"servicemesh-operator3-55f49c5f94-8n2cp\" (UID: \"b0747d85-6a38-47a0-bd2b-1f21597229db\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" Apr 18 02:54:06.902155 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.902121 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-598jz\" (UniqueName: \"kubernetes.io/projected/b0747d85-6a38-47a0-bd2b-1f21597229db-kube-api-access-598jz\") pod \"servicemesh-operator3-55f49c5f94-8n2cp\" (UID: \"b0747d85-6a38-47a0-bd2b-1f21597229db\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" Apr 18 02:54:06.906306 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.906252 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-82x78" podStartSLOduration=2.391682928 podStartE2EDuration="4.906234201s" podCreationTimestamp="2026-04-18 02:54:02 +0000 UTC" firstStartedPulling="2026-04-18 02:54:03.808429937 +0000 UTC m=+486.948281252" lastFinishedPulling="2026-04-18 02:54:06.322981209 +0000 UTC m=+489.462832525" observedRunningTime="2026-04-18 02:54:06.905743104 +0000 UTC m=+490.045594438" watchObservedRunningTime="2026-04-18 02:54:06.906234201 +0000 UTC m=+490.046085537" Apr 18 02:54:06.995399 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.995357 
2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9" Apr 18 02:54:06.995783 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:06.995766 2577 scope.go:117] "RemoveContainer" containerID="921663f7f978c4ac9badf818c74b2f3ec6e62bec9103553a5e23569c17b4552b" Apr 18 02:54:06.995968 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:54:06.995949 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-djdc9_opendatahub(0f7c1882-395f-4300-aeaa-4c83728c6e2e)\"" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9" podUID="0f7c1882-395f-4300-aeaa-4c83728c6e2e" Apr 18 02:54:07.009865 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:07.009837 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" Apr 18 02:54:07.149906 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:07.149868 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp"] Apr 18 02:54:07.153828 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:54:07.153796 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0747d85_6a38_47a0_bd2b_1f21597229db.slice/crio-a3f3dc027f43c95edd6e6ab6b12091bfc99e3f189dbbc292782aebe88a72a95b WatchSource:0}: Error finding container a3f3dc027f43c95edd6e6ab6b12091bfc99e3f189dbbc292782aebe88a72a95b: Status 404 returned error can't find the container with id a3f3dc027f43c95edd6e6ab6b12091bfc99e3f189dbbc292782aebe88a72a95b Apr 18 02:54:07.864365 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:07.864266 2577 generic.go:358] "Generic (PLEG): container finished" podID="b416ba94-9393-4327-9083-ea5a5cb3bc1a" 
containerID="caddac920b3574a101e17237b97244ad27e4a2dab63948bdc19f2143832e01d4" exitCode=0 Apr 18 02:54:07.864365 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:07.864343 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq" event={"ID":"b416ba94-9393-4327-9083-ea5a5cb3bc1a","Type":"ContainerDied","Data":"caddac920b3574a101e17237b97244ad27e4a2dab63948bdc19f2143832e01d4"} Apr 18 02:54:07.866932 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:07.866893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" event={"ID":"b0747d85-6a38-47a0-bd2b-1f21597229db","Type":"ContainerStarted","Data":"a3f3dc027f43c95edd6e6ab6b12091bfc99e3f189dbbc292782aebe88a72a95b"} Apr 18 02:54:08.877591 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:08.877532 2577 generic.go:358] "Generic (PLEG): container finished" podID="b416ba94-9393-4327-9083-ea5a5cb3bc1a" containerID="e77b33e13d82e6849bc32038655a0d3b9fb2ac0f756c6eafa1103fd6d19cc9ea" exitCode=0 Apr 18 02:54:08.878038 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:08.877607 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq" event={"ID":"b416ba94-9393-4327-9083-ea5a5cb3bc1a","Type":"ContainerDied","Data":"e77b33e13d82e6849bc32038655a0d3b9fb2ac0f756c6eafa1103fd6d19cc9ea"} Apr 18 02:54:10.030906 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.030883 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq" Apr 18 02:54:10.211000 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.210896 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b416ba94-9393-4327-9083-ea5a5cb3bc1a-bundle\") pod \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\" (UID: \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\") " Apr 18 02:54:10.211000 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.210949 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt5q8\" (UniqueName: \"kubernetes.io/projected/b416ba94-9393-4327-9083-ea5a5cb3bc1a-kube-api-access-lt5q8\") pod \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\" (UID: \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\") " Apr 18 02:54:10.211000 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.210982 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b416ba94-9393-4327-9083-ea5a5cb3bc1a-util\") pod \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\" (UID: \"b416ba94-9393-4327-9083-ea5a5cb3bc1a\") " Apr 18 02:54:10.212350 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.212316 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b416ba94-9393-4327-9083-ea5a5cb3bc1a-bundle" (OuterVolumeSpecName: "bundle") pod "b416ba94-9393-4327-9083-ea5a5cb3bc1a" (UID: "b416ba94-9393-4327-9083-ea5a5cb3bc1a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:54:10.213911 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.213872 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b416ba94-9393-4327-9083-ea5a5cb3bc1a-kube-api-access-lt5q8" (OuterVolumeSpecName: "kube-api-access-lt5q8") pod "b416ba94-9393-4327-9083-ea5a5cb3bc1a" (UID: "b416ba94-9393-4327-9083-ea5a5cb3bc1a"). InnerVolumeSpecName "kube-api-access-lt5q8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:54:10.218462 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.218432 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b416ba94-9393-4327-9083-ea5a5cb3bc1a-util" (OuterVolumeSpecName: "util") pod "b416ba94-9393-4327-9083-ea5a5cb3bc1a" (UID: "b416ba94-9393-4327-9083-ea5a5cb3bc1a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:54:10.312474 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.312434 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b416ba94-9393-4327-9083-ea5a5cb3bc1a-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:54:10.312474 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.312468 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lt5q8\" (UniqueName: \"kubernetes.io/projected/b416ba94-9393-4327-9083-ea5a5cb3bc1a-kube-api-access-lt5q8\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:54:10.312474 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.312477 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b416ba94-9393-4327-9083-ea5a5cb3bc1a-util\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:54:10.889622 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.889586 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq" event={"ID":"b416ba94-9393-4327-9083-ea5a5cb3bc1a","Type":"ContainerDied","Data":"fdd37b235f6d7f92a2cf9b5349dc8746dff3d9d41593d57ab7e325e3b96fe9b7"} Apr 18 02:54:10.889622 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.889622 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd37b235f6d7f92a2cf9b5349dc8746dff3d9d41593d57ab7e325e3b96fe9b7" Apr 18 02:54:10.889875 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.889634 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fwfxq" Apr 18 02:54:10.891115 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.891087 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" event={"ID":"b0747d85-6a38-47a0-bd2b-1f21597229db","Type":"ContainerStarted","Data":"bdba296a6430fdd303493f77eb98a167698d6e87f1a2caafd57c22dfc6989828"} Apr 18 02:54:10.891299 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.891280 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" Apr 18 02:54:10.913039 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:10.912984 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" podStartSLOduration=2.124354984 podStartE2EDuration="4.912968058s" podCreationTimestamp="2026-04-18 02:54:06 +0000 UTC" firstStartedPulling="2026-04-18 02:54:07.156165425 +0000 UTC m=+490.296016737" lastFinishedPulling="2026-04-18 02:54:09.944778497 +0000 UTC m=+493.084629811" observedRunningTime="2026-04-18 02:54:10.910880062 +0000 UTC m=+494.050731404" watchObservedRunningTime="2026-04-18 02:54:10.912968058 +0000 UTC m=+494.052819397" Apr 18 
02:54:12.558384 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.558345 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5"] Apr 18 02:54:12.558834 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.558818 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b416ba94-9393-4327-9083-ea5a5cb3bc1a" containerName="pull" Apr 18 02:54:12.558903 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.558838 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b416ba94-9393-4327-9083-ea5a5cb3bc1a" containerName="pull" Apr 18 02:54:12.558903 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.558865 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b416ba94-9393-4327-9083-ea5a5cb3bc1a" containerName="util" Apr 18 02:54:12.558903 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.558874 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b416ba94-9393-4327-9083-ea5a5cb3bc1a" containerName="util" Apr 18 02:54:12.558903 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.558894 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b416ba94-9393-4327-9083-ea5a5cb3bc1a" containerName="extract" Apr 18 02:54:12.558903 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.558902 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b416ba94-9393-4327-9083-ea5a5cb3bc1a" containerName="extract" Apr 18 02:54:12.559064 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.558975 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b416ba94-9393-4327-9083-ea5a5cb3bc1a" containerName="extract" Apr 18 02:54:12.563469 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.563449 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.565683 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.565660 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 18 02:54:12.565788 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.565742 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 18 02:54:12.565788 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.565779 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-v845q\"" Apr 18 02:54:12.566060 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.566041 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 18 02:54:12.566243 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.566229 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 18 02:54:12.573895 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.573869 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5"] Apr 18 02:54:12.628737 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.628698 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.628737 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.628742 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.628980 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.628829 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnf7h\" (UniqueName: \"kubernetes.io/projected/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-kube-api-access-lnf7h\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.628980 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.628866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.628980 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.628940 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.628980 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.628967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" 
(UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.629140 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.629018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.729821 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.729784 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnf7h\" (UniqueName: \"kubernetes.io/projected/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-kube-api-access-lnf7h\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.729821 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.729824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.730070 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.729859 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.730070 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.729879 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.730070 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.729902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.730070 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.730043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.730355 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.730103 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.730781 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.730760 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.732469 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.732450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.732469 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.732458 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.732629 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.732572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.732629 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.732604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.738349 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.738326 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.738430 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.738334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnf7h\" (UniqueName: \"kubernetes.io/projected/2f2f0e0a-1564-4e11-9483-0870a4b1f8f2-kube-api-access-lnf7h\") pod \"istiod-openshift-gateway-55ff986f96-mnlc5\" (UID: \"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:12.874851 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:12.874763 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:13.007399 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:13.007322 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5"] Apr 18 02:54:13.908825 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:13.908782 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" event={"ID":"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2","Type":"ContainerStarted","Data":"54fbbbc4a636d55399dba2e2a96efc0329b91a9ac8157f43b4c8f1080d48e48c"} Apr 18 02:54:15.879237 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:15.879197 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 18 02:54:15.879526 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:15.879265 2577 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 18 02:54:16.704838 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.704804 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh"] Apr 18 02:54:16.708314 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.708290 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.710742 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.710714 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-br2rn\"" Apr 18 02:54:16.717771 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.716804 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh"] Apr 18 02:54:16.759632 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.759597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.759632 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.759635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1062f7de-5f7a-4797-a63a-e6799379b8fc-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") 
" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.759879 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.759698 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.759879 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.759774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1062f7de-5f7a-4797-a63a-e6799379b8fc-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.759879 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.759798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.759879 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.759815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: 
\"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.759879 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.759840 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.760143 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.759902 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wxb5\" (UniqueName: \"kubernetes.io/projected/1062f7de-5f7a-4797-a63a-e6799379b8fc-kube-api-access-9wxb5\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.760143 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.759928 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1062f7de-5f7a-4797-a63a-e6799379b8fc-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.860469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: 
\"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.860587 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1062f7de-5f7a-4797-a63a-e6799379b8fc-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.860626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.860650 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.860687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.860720 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wxb5\" (UniqueName: \"kubernetes.io/projected/1062f7de-5f7a-4797-a63a-e6799379b8fc-kube-api-access-9wxb5\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.860747 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1062f7de-5f7a-4797-a63a-e6799379b8fc-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.860800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.860829 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1062f7de-5f7a-4797-a63a-e6799379b8fc-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.860991 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861270 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.861266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1062f7de-5f7a-4797-a63a-e6799379b8fc-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861958 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.861657 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.861958 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.861869 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 
18 02:54:16.862098 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.862077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.864799 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.864732 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1062f7de-5f7a-4797-a63a-e6799379b8fc-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.864940 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.864919 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1062f7de-5f7a-4797-a63a-e6799379b8fc-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.869167 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.869140 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1062f7de-5f7a-4797-a63a-e6799379b8fc-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.869321 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.869302 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wxb5\" (UniqueName: \"kubernetes.io/projected/1062f7de-5f7a-4797-a63a-e6799379b8fc-kube-api-access-9wxb5\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fm44jh\" (UID: \"1062f7de-5f7a-4797-a63a-e6799379b8fc\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:16.920728 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.920691 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" event={"ID":"2f2f0e0a-1564-4e11-9483-0870a4b1f8f2","Type":"ContainerStarted","Data":"f8b92f4b6bf6a1392117ab11816f27dfcc2b8bee6b191efb97a1f06042a6135b"} Apr 18 02:54:16.921181 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.920894 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:16.922761 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.922740 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" Apr 18 02:54:16.955503 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.955387 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mnlc5" podStartSLOduration=2.091672587 podStartE2EDuration="4.955362017s" podCreationTimestamp="2026-04-18 02:54:12 +0000 UTC" firstStartedPulling="2026-04-18 02:54:13.015261388 +0000 UTC m=+496.155112703" lastFinishedPulling="2026-04-18 02:54:15.878950817 +0000 UTC m=+499.018802133" observedRunningTime="2026-04-18 02:54:16.93830177 +0000 UTC m=+500.078153118" watchObservedRunningTime="2026-04-18 02:54:16.955362017 +0000 UTC m=+500.095213352" Apr 18 02:54:16.995058 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.995017 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9" Apr 18 02:54:16.995611 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:16.995532 2577 scope.go:117] "RemoveContainer" containerID="921663f7f978c4ac9badf818c74b2f3ec6e62bec9103553a5e23569c17b4552b" Apr 18 02:54:17.021781 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:17.021752 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:17.162271 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:17.162240 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh"] Apr 18 02:54:17.164725 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:54:17.164687 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1062f7de_5f7a_4797_a63a_e6799379b8fc.slice/crio-e227d6bf5ce124bc37ca94ecec10aed2dd987b1f0985894105cc49bd8821420c WatchSource:0}: Error finding container e227d6bf5ce124bc37ca94ecec10aed2dd987b1f0985894105cc49bd8821420c: Status 404 returned error can't find the container with id e227d6bf5ce124bc37ca94ecec10aed2dd987b1f0985894105cc49bd8821420c Apr 18 02:54:17.924911 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:17.924871 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" event={"ID":"1062f7de-5f7a-4797-a63a-e6799379b8fc","Type":"ContainerStarted","Data":"e227d6bf5ce124bc37ca94ecec10aed2dd987b1f0985894105cc49bd8821420c"} Apr 18 02:54:17.926577 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:17.926513 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9" 
event={"ID":"0f7c1882-395f-4300-aeaa-4c83728c6e2e","Type":"ContainerStarted","Data":"40b065372de9e234ce96383e3aa3d8ca2b2e84f45d57d258a21065a890252d9f"} Apr 18 02:54:17.926764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:17.926746 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9" Apr 18 02:54:17.950463 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:17.950397 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9" podStartSLOduration=1.998313676 podStartE2EDuration="21.950383642s" podCreationTimestamp="2026-04-18 02:53:56 +0000 UTC" firstStartedPulling="2026-04-18 02:53:57.32872623 +0000 UTC m=+480.468577543" lastFinishedPulling="2026-04-18 02:54:17.280796179 +0000 UTC m=+500.420647509" observedRunningTime="2026-04-18 02:54:17.947755765 +0000 UTC m=+501.087607102" watchObservedRunningTime="2026-04-18 02:54:17.950383642 +0000 UTC m=+501.090234977" Apr 18 02:54:19.909170 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:19.909129 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 18 02:54:19.909443 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:19.909246 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 18 02:54:19.909443 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:19.909288 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 18 02:54:20.943072 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:20.943037 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" event={"ID":"1062f7de-5f7a-4797-a63a-e6799379b8fc","Type":"ContainerStarted","Data":"af90df524531113f1b2856e32e19ba1f06168bb415ee8b30ed513d2cf64dbd6d"} Apr 18 02:54:20.964928 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:20.964875 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" podStartSLOduration=2.222780281 podStartE2EDuration="4.964861467s" podCreationTimestamp="2026-04-18 02:54:16 +0000 UTC" firstStartedPulling="2026-04-18 02:54:17.166787299 +0000 UTC m=+500.306638616" lastFinishedPulling="2026-04-18 02:54:19.908868479 +0000 UTC m=+503.048719802" observedRunningTime="2026-04-18 02:54:20.961828952 +0000 UTC m=+504.101680290" watchObservedRunningTime="2026-04-18 02:54:20.964861467 +0000 UTC m=+504.104712802" Apr 18 02:54:21.022266 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:21.022234 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:21.027350 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:21.027323 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:21.896990 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:21.896960 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8n2cp" Apr 18 02:54:21.947344 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:21.947311 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:21.948275 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:21.948255 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fm44jh" Apr 18 02:54:28.935495 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:28.935465 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-djdc9" Apr 18 02:54:37.872264 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:54:37.872233 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-82x78" Apr 18 02:55:07.477603 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.477537 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb"] Apr 18 02:55:07.488579 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.488525 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb"] Apr 18 02:55:07.488754 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.488685 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" Apr 18 02:55:07.491352 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.491329 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 18 02:55:07.492469 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.492441 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-b57hm\"" Apr 18 02:55:07.492601 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.492443 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 18 02:55:07.602335 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.602295 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb4kz\" (UniqueName: \"kubernetes.io/projected/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-kube-api-access-kb4kz\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb\" (UID: \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" Apr 18 02:55:07.602516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.602440 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb\" (UID: \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" Apr 18 02:55:07.602516 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.602471 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb\" (UID: \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" Apr 18 02:55:07.703838 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.703793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb4kz\" (UniqueName: \"kubernetes.io/projected/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-kube-api-access-kb4kz\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb\" (UID: \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" Apr 18 02:55:07.704010 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.703902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb\" (UID: \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" Apr 18 02:55:07.704010 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.703919 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb\" (UID: \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" Apr 18 02:55:07.704312 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.704291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-bundle\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb\" (UID: \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" Apr 18 02:55:07.704353 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.704315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb\" (UID: \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" Apr 18 02:55:07.712082 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.712049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb4kz\" (UniqueName: \"kubernetes.io/projected/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-kube-api-access-kb4kz\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb\" (UID: \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" Apr 18 02:55:07.808403 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.808362 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" Apr 18 02:55:07.936475 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:07.936445 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb"] Apr 18 02:55:07.939138 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:55:07.939111 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod225bddc0_d5f4_4ba2_8605_d8b5a5825f9a.slice/crio-fa6f4e89e6ddb80e38273da97dcd16e8be6d6a5c863630012c385a85c6aa26cd WatchSource:0}: Error finding container fa6f4e89e6ddb80e38273da97dcd16e8be6d6a5c863630012c385a85c6aa26cd: Status 404 returned error can't find the container with id fa6f4e89e6ddb80e38273da97dcd16e8be6d6a5c863630012c385a85c6aa26cd Apr 18 02:55:08.075413 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.075326 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"] Apr 18 02:55:08.078860 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.078844 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z" Apr 18 02:55:08.085469 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.085439 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"] Apr 18 02:55:08.114013 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.113983 2577 generic.go:358] "Generic (PLEG): container finished" podID="225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" containerID="db0ee824b69ab1998dbfbe6fdc8ed27d43aa72dac03784bdc4f37ced43353150" exitCode=0 Apr 18 02:55:08.114147 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.114063 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" event={"ID":"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a","Type":"ContainerDied","Data":"db0ee824b69ab1998dbfbe6fdc8ed27d43aa72dac03784bdc4f37ced43353150"} Apr 18 02:55:08.114147 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.114105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" event={"ID":"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a","Type":"ContainerStarted","Data":"fa6f4e89e6ddb80e38273da97dcd16e8be6d6a5c863630012c385a85c6aa26cd"} Apr 18 02:55:08.208051 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.208017 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scbsl\" (UniqueName: \"kubernetes.io/projected/ee130515-6cbc-4c3b-aa7b-b4df838191b7-kube-api-access-scbsl\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z\" (UID: \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z" Apr 18 02:55:08.208208 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.208064 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee130515-6cbc-4c3b-aa7b-b4df838191b7-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z\" (UID: \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"
Apr 18 02:55:08.208254 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.208214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee130515-6cbc-4c3b-aa7b-b4df838191b7-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z\" (UID: \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"
Apr 18 02:55:08.309517 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.309481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scbsl\" (UniqueName: \"kubernetes.io/projected/ee130515-6cbc-4c3b-aa7b-b4df838191b7-kube-api-access-scbsl\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z\" (UID: \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"
Apr 18 02:55:08.309732 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.309527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee130515-6cbc-4c3b-aa7b-b4df838191b7-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z\" (UID: \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"
Apr 18 02:55:08.309732 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.309662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee130515-6cbc-4c3b-aa7b-b4df838191b7-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z\" (UID: \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"
Apr 18 02:55:08.309927 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.309906 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee130515-6cbc-4c3b-aa7b-b4df838191b7-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z\" (UID: \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"
Apr 18 02:55:08.310098 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.310083 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee130515-6cbc-4c3b-aa7b-b4df838191b7-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z\" (UID: \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"
Apr 18 02:55:08.318538 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.318514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scbsl\" (UniqueName: \"kubernetes.io/projected/ee130515-6cbc-4c3b-aa7b-b4df838191b7-kube-api-access-scbsl\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z\" (UID: \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"
Apr 18 02:55:08.412722 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.412637 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"
Apr 18 02:55:08.534355 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.534328 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"]
Apr 18 02:55:08.535914 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:55:08.535890 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee130515_6cbc_4c3b_aa7b_b4df838191b7.slice/crio-02ae2990af47b315ce85cd38e54d75852aa00bc54b8d8e0a1d524a86c98812df WatchSource:0}: Error finding container 02ae2990af47b315ce85cd38e54d75852aa00bc54b8d8e0a1d524a86c98812df: Status 404 returned error can't find the container with id 02ae2990af47b315ce85cd38e54d75852aa00bc54b8d8e0a1d524a86c98812df
Apr 18 02:55:08.680190 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.680109 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"]
Apr 18 02:55:08.683639 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.683618 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"
Apr 18 02:55:08.690229 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.690203 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"]
Apr 18 02:55:08.814973 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.814942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2527da1f-3774-46d7-8048-e176c1e9e774-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg\" (UID: \"2527da1f-3774-46d7-8048-e176c1e9e774\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"
Apr 18 02:55:08.815132 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.814990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2527da1f-3774-46d7-8048-e176c1e9e774-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg\" (UID: \"2527da1f-3774-46d7-8048-e176c1e9e774\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"
Apr 18 02:55:08.815132 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.815089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv8jl\" (UniqueName: \"kubernetes.io/projected/2527da1f-3774-46d7-8048-e176c1e9e774-kube-api-access-xv8jl\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg\" (UID: \"2527da1f-3774-46d7-8048-e176c1e9e774\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"
Apr 18 02:55:08.916064 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.915971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2527da1f-3774-46d7-8048-e176c1e9e774-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg\" (UID: \"2527da1f-3774-46d7-8048-e176c1e9e774\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"
Apr 18 02:55:08.916064 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.916015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2527da1f-3774-46d7-8048-e176c1e9e774-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg\" (UID: \"2527da1f-3774-46d7-8048-e176c1e9e774\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"
Apr 18 02:55:08.916203 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.916087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv8jl\" (UniqueName: \"kubernetes.io/projected/2527da1f-3774-46d7-8048-e176c1e9e774-kube-api-access-xv8jl\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg\" (UID: \"2527da1f-3774-46d7-8048-e176c1e9e774\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"
Apr 18 02:55:08.916402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.916380 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2527da1f-3774-46d7-8048-e176c1e9e774-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg\" (UID: \"2527da1f-3774-46d7-8048-e176c1e9e774\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"
Apr 18 02:55:08.916467 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.916440 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2527da1f-3774-46d7-8048-e176c1e9e774-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg\" (UID: \"2527da1f-3774-46d7-8048-e176c1e9e774\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"
Apr 18 02:55:08.923473 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:08.923450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv8jl\" (UniqueName: \"kubernetes.io/projected/2527da1f-3774-46d7-8048-e176c1e9e774-kube-api-access-xv8jl\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg\" (UID: \"2527da1f-3774-46d7-8048-e176c1e9e774\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"
Apr 18 02:55:09.002487 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.002451 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"
Apr 18 02:55:09.085077 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.085048 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"]
Apr 18 02:55:09.089736 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.089718 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"
Apr 18 02:55:09.095201 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.095178 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"]
Apr 18 02:55:09.119634 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.119602 2577 generic.go:358] "Generic (PLEG): container finished" podID="ee130515-6cbc-4c3b-aa7b-b4df838191b7" containerID="7abf0726c9fb98e32cd13e924ffc09785b10b0d012854a08a2028231a63c389f" exitCode=0
Apr 18 02:55:09.119804 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.119643 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z" event={"ID":"ee130515-6cbc-4c3b-aa7b-b4df838191b7","Type":"ContainerDied","Data":"7abf0726c9fb98e32cd13e924ffc09785b10b0d012854a08a2028231a63c389f"}
Apr 18 02:55:09.119804 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.119674 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z" event={"ID":"ee130515-6cbc-4c3b-aa7b-b4df838191b7","Type":"ContainerStarted","Data":"02ae2990af47b315ce85cd38e54d75852aa00bc54b8d8e0a1d524a86c98812df"}
Apr 18 02:55:09.121544 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.121522 2577 generic.go:358] "Generic (PLEG): container finished" podID="225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" containerID="7d5106f036e8b06f51c69f58278dfe7e1bde2420c8564aedce0af7b9d176403b" exitCode=0
Apr 18 02:55:09.121695 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.121581 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" event={"ID":"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a","Type":"ContainerDied","Data":"7d5106f036e8b06f51c69f58278dfe7e1bde2420c8564aedce0af7b9d176403b"}
Apr 18 02:55:09.129937 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.129877 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"]
Apr 18 02:55:09.133269 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:55:09.133244 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2527da1f_3774_46d7_8048_e176c1e9e774.slice/crio-0f0990f203b04061158cc7443aef71ef740f852e47141aecccf594fbfd84d568 WatchSource:0}: Error finding container 0f0990f203b04061158cc7443aef71ef740f852e47141aecccf594fbfd84d568: Status 404 returned error can't find the container with id 0f0990f203b04061158cc7443aef71ef740f852e47141aecccf594fbfd84d568
Apr 18 02:55:09.219121 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.219074 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/889e9332-e348-4ec0-8b9e-c4160a045d91-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r\" (UID: \"889e9332-e348-4ec0-8b9e-c4160a045d91\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"
Apr 18 02:55:09.219301 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.219131 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7vq8\" (UniqueName: \"kubernetes.io/projected/889e9332-e348-4ec0-8b9e-c4160a045d91-kube-api-access-m7vq8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r\" (UID: \"889e9332-e348-4ec0-8b9e-c4160a045d91\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"
Apr 18 02:55:09.219301 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.219166 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/889e9332-e348-4ec0-8b9e-c4160a045d91-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r\" (UID: \"889e9332-e348-4ec0-8b9e-c4160a045d91\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"
Apr 18 02:55:09.320605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.320541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/889e9332-e348-4ec0-8b9e-c4160a045d91-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r\" (UID: \"889e9332-e348-4ec0-8b9e-c4160a045d91\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"
Apr 18 02:55:09.320605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.320610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7vq8\" (UniqueName: \"kubernetes.io/projected/889e9332-e348-4ec0-8b9e-c4160a045d91-kube-api-access-m7vq8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r\" (UID: \"889e9332-e348-4ec0-8b9e-c4160a045d91\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"
Apr 18 02:55:09.320786 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.320636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/889e9332-e348-4ec0-8b9e-c4160a045d91-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r\" (UID: \"889e9332-e348-4ec0-8b9e-c4160a045d91\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"
Apr 18 02:55:09.320944 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.320924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/889e9332-e348-4ec0-8b9e-c4160a045d91-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r\" (UID: \"889e9332-e348-4ec0-8b9e-c4160a045d91\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"
Apr 18 02:55:09.320980 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.320943 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/889e9332-e348-4ec0-8b9e-c4160a045d91-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r\" (UID: \"889e9332-e348-4ec0-8b9e-c4160a045d91\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"
Apr 18 02:55:09.328634 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.328616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7vq8\" (UniqueName: \"kubernetes.io/projected/889e9332-e348-4ec0-8b9e-c4160a045d91-kube-api-access-m7vq8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r\" (UID: \"889e9332-e348-4ec0-8b9e-c4160a045d91\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"
Apr 18 02:55:09.401393 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.401346 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"
Apr 18 02:55:09.522603 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:09.522571 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"]
Apr 18 02:55:09.524379 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:55:09.524315 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod889e9332_e348_4ec0_8b9e_c4160a045d91.slice/crio-042091e242bed66e6a59df97c8995f4d7084434feed1625ae47804587e31c932 WatchSource:0}: Error finding container 042091e242bed66e6a59df97c8995f4d7084434feed1625ae47804587e31c932: Status 404 returned error can't find the container with id 042091e242bed66e6a59df97c8995f4d7084434feed1625ae47804587e31c932
Apr 18 02:55:10.127208 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:10.127169 2577 generic.go:358] "Generic (PLEG): container finished" podID="225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" containerID="ec6c05437a2b1075bd1e2a4c4bcb4e790fb425b2ba8f7abcdb7bcf79c06b6e8a" exitCode=0
Apr 18 02:55:10.127689 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:10.127247 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" event={"ID":"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a","Type":"ContainerDied","Data":"ec6c05437a2b1075bd1e2a4c4bcb4e790fb425b2ba8f7abcdb7bcf79c06b6e8a"}
Apr 18 02:55:10.128474 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:10.128455 2577 generic.go:358] "Generic (PLEG): container finished" podID="889e9332-e348-4ec0-8b9e-c4160a045d91" containerID="9004de8df17d6f90fc2b0c5d59284af71351e46dd0c213351163c06feae69e16" exitCode=0
Apr 18 02:55:10.128576 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:10.128511 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r" event={"ID":"889e9332-e348-4ec0-8b9e-c4160a045d91","Type":"ContainerDied","Data":"9004de8df17d6f90fc2b0c5d59284af71351e46dd0c213351163c06feae69e16"}
Apr 18 02:55:10.128576 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:10.128529 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r" event={"ID":"889e9332-e348-4ec0-8b9e-c4160a045d91","Type":"ContainerStarted","Data":"042091e242bed66e6a59df97c8995f4d7084434feed1625ae47804587e31c932"}
Apr 18 02:55:10.129769 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:10.129748 2577 generic.go:358] "Generic (PLEG): container finished" podID="2527da1f-3774-46d7-8048-e176c1e9e774" containerID="0401d840343486170cfb247516a9560868d2554ae7a06228132a823bb7181144" exitCode=0
Apr 18 02:55:10.129877 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:10.129819 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg" event={"ID":"2527da1f-3774-46d7-8048-e176c1e9e774","Type":"ContainerDied","Data":"0401d840343486170cfb247516a9560868d2554ae7a06228132a823bb7181144"}
Apr 18 02:55:10.129877 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:10.129845 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg" event={"ID":"2527da1f-3774-46d7-8048-e176c1e9e774","Type":"ContainerStarted","Data":"0f0990f203b04061158cc7443aef71ef740f852e47141aecccf594fbfd84d568"}
Apr 18 02:55:10.131681 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:10.131661 2577 generic.go:358] "Generic (PLEG): container finished" podID="ee130515-6cbc-4c3b-aa7b-b4df838191b7" containerID="8f7e1888cff4627396331c9bcef07cbefdb0c2f421da7a4ff6ba63a231233cf9" exitCode=0
Apr 18 02:55:10.131764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:10.131692 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z" event={"ID":"ee130515-6cbc-4c3b-aa7b-b4df838191b7","Type":"ContainerDied","Data":"8f7e1888cff4627396331c9bcef07cbefdb0c2f421da7a4ff6ba63a231233cf9"}
Apr 18 02:55:11.138082 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.137993 2577 generic.go:358] "Generic (PLEG): container finished" podID="ee130515-6cbc-4c3b-aa7b-b4df838191b7" containerID="92bfd6b8e1da8bed0b4ca06a5b061053b775a7980d53387f791a588ad9a12f2e" exitCode=0
Apr 18 02:55:11.138486 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.138081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z" event={"ID":"ee130515-6cbc-4c3b-aa7b-b4df838191b7","Type":"ContainerDied","Data":"92bfd6b8e1da8bed0b4ca06a5b061053b775a7980d53387f791a588ad9a12f2e"}
Apr 18 02:55:11.139729 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.139705 2577 generic.go:358] "Generic (PLEG): container finished" podID="889e9332-e348-4ec0-8b9e-c4160a045d91" containerID="de35da93821165dc4c151f4dca08205882e3dca07e9b5f6a0d8367218efd9db0" exitCode=0
Apr 18 02:55:11.139842 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.139764 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r" event={"ID":"889e9332-e348-4ec0-8b9e-c4160a045d91","Type":"ContainerDied","Data":"de35da93821165dc4c151f4dca08205882e3dca07e9b5f6a0d8367218efd9db0"}
Apr 18 02:55:11.141486 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.141396 2577 generic.go:358] "Generic (PLEG): container finished" podID="2527da1f-3774-46d7-8048-e176c1e9e774" containerID="c0ae9b3fd633279884265938bd04f86f268d374a194a50ab05645007ed33d789" exitCode=0
Apr 18 02:55:11.141486 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.141472 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg" event={"ID":"2527da1f-3774-46d7-8048-e176c1e9e774","Type":"ContainerDied","Data":"c0ae9b3fd633279884265938bd04f86f268d374a194a50ab05645007ed33d789"}
Apr 18 02:55:11.263650 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.263543 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb"
Apr 18 02:55:11.437089 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.436994 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-bundle\") pod \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\" (UID: \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\") "
Apr 18 02:55:11.437089 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.437050 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-util\") pod \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\" (UID: \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\") "
Apr 18 02:55:11.437282 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.437095 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb4kz\" (UniqueName: \"kubernetes.io/projected/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-kube-api-access-kb4kz\") pod \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\" (UID: \"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a\") "
Apr 18 02:55:11.437801 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.437774 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-bundle" (OuterVolumeSpecName: "bundle") pod "225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" (UID: "225bddc0-d5f4-4ba2-8605-d8b5a5825f9a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:55:11.439373 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.439339 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-kube-api-access-kb4kz" (OuterVolumeSpecName: "kube-api-access-kb4kz") pod "225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" (UID: "225bddc0-d5f4-4ba2-8605-d8b5a5825f9a"). InnerVolumeSpecName "kube-api-access-kb4kz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:55:11.442651 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.442627 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-util" (OuterVolumeSpecName: "util") pod "225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" (UID: "225bddc0-d5f4-4ba2-8605-d8b5a5825f9a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:55:11.538050 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.538001 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\""
Apr 18 02:55:11.538050 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.538044 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-util\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\""
Apr 18 02:55:11.538050 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:11.538054 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kb4kz\" (UniqueName: \"kubernetes.io/projected/225bddc0-d5f4-4ba2-8605-d8b5a5825f9a-kube-api-access-kb4kz\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\""
Apr 18 02:55:12.152068 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.151972 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb" event={"ID":"225bddc0-d5f4-4ba2-8605-d8b5a5825f9a","Type":"ContainerDied","Data":"fa6f4e89e6ddb80e38273da97dcd16e8be6d6a5c863630012c385a85c6aa26cd"}
Apr 18 02:55:12.152068 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.152013 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa6f4e89e6ddb80e38273da97dcd16e8be6d6a5c863630012c385a85c6aa26cd"
Apr 18 02:55:12.152068 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.152016 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb"
Apr 18 02:55:12.153903 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.153877 2577 generic.go:358] "Generic (PLEG): container finished" podID="889e9332-e348-4ec0-8b9e-c4160a045d91" containerID="1b08d8496525816beea7695926c08362e5f5b78ba249389fd993970019a54a4a" exitCode=0
Apr 18 02:55:12.154019 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.153941 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r" event={"ID":"889e9332-e348-4ec0-8b9e-c4160a045d91","Type":"ContainerDied","Data":"1b08d8496525816beea7695926c08362e5f5b78ba249389fd993970019a54a4a"}
Apr 18 02:55:12.155696 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.155674 2577 generic.go:358] "Generic (PLEG): container finished" podID="2527da1f-3774-46d7-8048-e176c1e9e774" containerID="1d16c65319fd67f0bcb45dfb04511e734faa62d2d03410188406c6267ae34698" exitCode=0
Apr 18 02:55:12.155809 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.155793 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg" event={"ID":"2527da1f-3774-46d7-8048-e176c1e9e774","Type":"ContainerDied","Data":"1d16c65319fd67f0bcb45dfb04511e734faa62d2d03410188406c6267ae34698"}
Apr 18 02:55:12.280764 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.280743 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"
Apr 18 02:55:12.444599 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.444494 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scbsl\" (UniqueName: \"kubernetes.io/projected/ee130515-6cbc-4c3b-aa7b-b4df838191b7-kube-api-access-scbsl\") pod \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\" (UID: \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\") "
Apr 18 02:55:12.444599 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.444567 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee130515-6cbc-4c3b-aa7b-b4df838191b7-bundle\") pod \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\" (UID: \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\") "
Apr 18 02:55:12.444599 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.444591 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee130515-6cbc-4c3b-aa7b-b4df838191b7-util\") pod \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\" (UID: \"ee130515-6cbc-4c3b-aa7b-b4df838191b7\") "
Apr 18 02:55:12.445122 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.445095 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee130515-6cbc-4c3b-aa7b-b4df838191b7-bundle" (OuterVolumeSpecName: "bundle") pod "ee130515-6cbc-4c3b-aa7b-b4df838191b7" (UID: "ee130515-6cbc-4c3b-aa7b-b4df838191b7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:55:12.446539 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.446515 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee130515-6cbc-4c3b-aa7b-b4df838191b7-kube-api-access-scbsl" (OuterVolumeSpecName: "kube-api-access-scbsl") pod "ee130515-6cbc-4c3b-aa7b-b4df838191b7" (UID: "ee130515-6cbc-4c3b-aa7b-b4df838191b7"). InnerVolumeSpecName "kube-api-access-scbsl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:55:12.451123 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.451079 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee130515-6cbc-4c3b-aa7b-b4df838191b7-util" (OuterVolumeSpecName: "util") pod "ee130515-6cbc-4c3b-aa7b-b4df838191b7" (UID: "ee130515-6cbc-4c3b-aa7b-b4df838191b7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:55:12.545960 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.545920 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-scbsl\" (UniqueName: \"kubernetes.io/projected/ee130515-6cbc-4c3b-aa7b-b4df838191b7-kube-api-access-scbsl\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\""
Apr 18 02:55:12.545960 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.545958 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee130515-6cbc-4c3b-aa7b-b4df838191b7-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\""
Apr 18 02:55:12.545960 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:12.545969 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee130515-6cbc-4c3b-aa7b-b4df838191b7-util\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\""
Apr 18 02:55:13.161207 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.161172 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z"
Apr 18 02:55:13.161207 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.161181 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z" event={"ID":"ee130515-6cbc-4c3b-aa7b-b4df838191b7","Type":"ContainerDied","Data":"02ae2990af47b315ce85cd38e54d75852aa00bc54b8d8e0a1d524a86c98812df"}
Apr 18 02:55:13.161207 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.161213 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02ae2990af47b315ce85cd38e54d75852aa00bc54b8d8e0a1d524a86c98812df"
Apr 18 02:55:13.288223 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.288201 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg"
Apr 18 02:55:13.316627 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.316607 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r"
Apr 18 02:55:13.452188 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.452109 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/889e9332-e348-4ec0-8b9e-c4160a045d91-util\") pod \"889e9332-e348-4ec0-8b9e-c4160a045d91\" (UID: \"889e9332-e348-4ec0-8b9e-c4160a045d91\") "
Apr 18 02:55:13.452188 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.452167 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7vq8\" (UniqueName: \"kubernetes.io/projected/889e9332-e348-4ec0-8b9e-c4160a045d91-kube-api-access-m7vq8\") pod \"889e9332-e348-4ec0-8b9e-c4160a045d91\" (UID: \"889e9332-e348-4ec0-8b9e-c4160a045d91\") "
Apr 18 02:55:13.452362 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.452228 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/889e9332-e348-4ec0-8b9e-c4160a045d91-bundle\") pod \"889e9332-e348-4ec0-8b9e-c4160a045d91\" (UID: \"889e9332-e348-4ec0-8b9e-c4160a045d91\") "
Apr 18 02:55:13.452362 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.452255 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2527da1f-3774-46d7-8048-e176c1e9e774-util\") pod \"2527da1f-3774-46d7-8048-e176c1e9e774\" (UID: \"2527da1f-3774-46d7-8048-e176c1e9e774\") "
Apr 18 02:55:13.452362 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.452311 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv8jl\" (UniqueName: \"kubernetes.io/projected/2527da1f-3774-46d7-8048-e176c1e9e774-kube-api-access-xv8jl\") pod \"2527da1f-3774-46d7-8048-e176c1e9e774\" (UID: \"2527da1f-3774-46d7-8048-e176c1e9e774\") "
Apr 18 02:55:13.452362 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.452349 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2527da1f-3774-46d7-8048-e176c1e9e774-bundle\") pod \"2527da1f-3774-46d7-8048-e176c1e9e774\" (UID: \"2527da1f-3774-46d7-8048-e176c1e9e774\") "
Apr 18 02:55:13.452948 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.452910 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2527da1f-3774-46d7-8048-e176c1e9e774-bundle" (OuterVolumeSpecName: "bundle") pod "2527da1f-3774-46d7-8048-e176c1e9e774" (UID: "2527da1f-3774-46d7-8048-e176c1e9e774"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:55:13.453079 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.453000 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/889e9332-e348-4ec0-8b9e-c4160a045d91-bundle" (OuterVolumeSpecName: "bundle") pod "889e9332-e348-4ec0-8b9e-c4160a045d91" (UID: "889e9332-e348-4ec0-8b9e-c4160a045d91"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:55:13.454349 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.454320 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2527da1f-3774-46d7-8048-e176c1e9e774-kube-api-access-xv8jl" (OuterVolumeSpecName: "kube-api-access-xv8jl") pod "2527da1f-3774-46d7-8048-e176c1e9e774" (UID: "2527da1f-3774-46d7-8048-e176c1e9e774"). InnerVolumeSpecName "kube-api-access-xv8jl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:55:13.454720 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.454689 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/889e9332-e348-4ec0-8b9e-c4160a045d91-kube-api-access-m7vq8" (OuterVolumeSpecName: "kube-api-access-m7vq8") pod "889e9332-e348-4ec0-8b9e-c4160a045d91" (UID: "889e9332-e348-4ec0-8b9e-c4160a045d91"). InnerVolumeSpecName "kube-api-access-m7vq8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:55:13.457840 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.457817 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/889e9332-e348-4ec0-8b9e-c4160a045d91-util" (OuterVolumeSpecName: "util") pod "889e9332-e348-4ec0-8b9e-c4160a045d91" (UID: "889e9332-e348-4ec0-8b9e-c4160a045d91"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:55:13.458253 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.458232 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2527da1f-3774-46d7-8048-e176c1e9e774-util" (OuterVolumeSpecName: "util") pod "2527da1f-3774-46d7-8048-e176c1e9e774" (UID: "2527da1f-3774-46d7-8048-e176c1e9e774"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:55:13.554029 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.553995 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2527da1f-3774-46d7-8048-e176c1e9e774-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:55:13.554029 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.554026 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/889e9332-e348-4ec0-8b9e-c4160a045d91-util\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:55:13.554255 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.554039 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m7vq8\" (UniqueName: \"kubernetes.io/projected/889e9332-e348-4ec0-8b9e-c4160a045d91-kube-api-access-m7vq8\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:55:13.554255 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.554053 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/889e9332-e348-4ec0-8b9e-c4160a045d91-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:55:13.554255 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.554067 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2527da1f-3774-46d7-8048-e176c1e9e774-util\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:55:13.554255 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:13.554078 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xv8jl\" (UniqueName: \"kubernetes.io/projected/2527da1f-3774-46d7-8048-e176c1e9e774-kube-api-access-xv8jl\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:55:14.166525 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:14.166488 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r" event={"ID":"889e9332-e348-4ec0-8b9e-c4160a045d91","Type":"ContainerDied","Data":"042091e242bed66e6a59df97c8995f4d7084434feed1625ae47804587e31c932"} Apr 18 02:55:14.166525 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:14.166525 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="042091e242bed66e6a59df97c8995f4d7084434feed1625ae47804587e31c932" Apr 18 02:55:14.166525 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:14.166528 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r" Apr 18 02:55:14.168301 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:14.168282 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg" Apr 18 02:55:14.168402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:14.168308 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg" event={"ID":"2527da1f-3774-46d7-8048-e176c1e9e774","Type":"ContainerDied","Data":"0f0990f203b04061158cc7443aef71ef740f852e47141aecccf594fbfd84d568"} Apr 18 02:55:14.168402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:14.168336 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f0990f203b04061158cc7443aef71ef740f852e47141aecccf594fbfd84d568" Apr 18 02:55:27.376808 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.376773 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj"] Apr 18 02:55:27.377290 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377258 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" 
containerName="util" Apr 18 02:55:27.377290 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377279 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" containerName="util" Apr 18 02:55:27.377402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377294 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee130515-6cbc-4c3b-aa7b-b4df838191b7" containerName="util" Apr 18 02:55:27.377402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377303 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee130515-6cbc-4c3b-aa7b-b4df838191b7" containerName="util" Apr 18 02:55:27.377402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377314 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" containerName="pull" Apr 18 02:55:27.377402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377323 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" containerName="pull" Apr 18 02:55:27.377402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377335 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2527da1f-3774-46d7-8048-e176c1e9e774" containerName="util" Apr 18 02:55:27.377402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377344 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2527da1f-3774-46d7-8048-e176c1e9e774" containerName="util" Apr 18 02:55:27.377402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377358 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="889e9332-e348-4ec0-8b9e-c4160a045d91" containerName="pull" Apr 18 02:55:27.377402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377366 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="889e9332-e348-4ec0-8b9e-c4160a045d91" containerName="pull" Apr 18 02:55:27.377402 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:55:27.377380 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee130515-6cbc-4c3b-aa7b-b4df838191b7" containerName="extract" Apr 18 02:55:27.377402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377389 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee130515-6cbc-4c3b-aa7b-b4df838191b7" containerName="extract" Apr 18 02:55:27.377402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377401 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" containerName="extract" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377409 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" containerName="extract" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377419 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee130515-6cbc-4c3b-aa7b-b4df838191b7" containerName="pull" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377428 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee130515-6cbc-4c3b-aa7b-b4df838191b7" containerName="pull" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377442 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2527da1f-3774-46d7-8048-e176c1e9e774" containerName="pull" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377450 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2527da1f-3774-46d7-8048-e176c1e9e774" containerName="pull" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377469 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="889e9332-e348-4ec0-8b9e-c4160a045d91" containerName="util" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377478 2577 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="889e9332-e348-4ec0-8b9e-c4160a045d91" containerName="util" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377486 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2527da1f-3774-46d7-8048-e176c1e9e774" containerName="extract" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377495 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2527da1f-3774-46d7-8048-e176c1e9e774" containerName="extract" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377508 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="889e9332-e348-4ec0-8b9e-c4160a045d91" containerName="extract" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377516 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="889e9332-e348-4ec0-8b9e-c4160a045d91" containerName="extract" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377632 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee130515-6cbc-4c3b-aa7b-b4df838191b7" containerName="extract" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377649 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2527da1f-3774-46d7-8048-e176c1e9e774" containerName="extract" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377662 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="225bddc0-d5f4-4ba2-8605-d8b5a5825f9a" containerName="extract" Apr 18 02:55:27.378099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.377673 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="889e9332-e348-4ec0-8b9e-c4160a045d91" containerName="extract" Apr 18 02:55:27.381309 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.381289 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj" Apr 18 02:55:27.383772 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.383752 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 18 02:55:27.384658 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.384636 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-pfpx9\"" Apr 18 02:55:27.384734 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.384644 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 18 02:55:27.384734 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.384647 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 18 02:55:27.390104 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.390079 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj"] Apr 18 02:55:27.477948 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.477917 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpzrh\" (UniqueName: \"kubernetes.io/projected/4abf3863-834d-4961-bcd8-40633bf2747f-kube-api-access-zpzrh\") pod \"dns-operator-controller-manager-648d5c98bc-b86bj\" (UID: \"4abf3863-834d-4961-bcd8-40633bf2747f\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj" Apr 18 02:55:27.579407 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.579368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpzrh\" (UniqueName: \"kubernetes.io/projected/4abf3863-834d-4961-bcd8-40633bf2747f-kube-api-access-zpzrh\") pod \"dns-operator-controller-manager-648d5c98bc-b86bj\" 
(UID: \"4abf3863-834d-4961-bcd8-40633bf2747f\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj" Apr 18 02:55:27.590229 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.590199 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpzrh\" (UniqueName: \"kubernetes.io/projected/4abf3863-834d-4961-bcd8-40633bf2747f-kube-api-access-zpzrh\") pod \"dns-operator-controller-manager-648d5c98bc-b86bj\" (UID: \"4abf3863-834d-4961-bcd8-40633bf2747f\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj" Apr 18 02:55:27.692616 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.692497 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj" Apr 18 02:55:27.824729 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:27.824689 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj"] Apr 18 02:55:27.826504 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:55:27.826479 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4abf3863_834d_4961_bcd8_40633bf2747f.slice/crio-1789144bd4867f697662fb900559eb3a70efaec92d16f317956a49f63c2fd8a7 WatchSource:0}: Error finding container 1789144bd4867f697662fb900559eb3a70efaec92d16f317956a49f63c2fd8a7: Status 404 returned error can't find the container with id 1789144bd4867f697662fb900559eb3a70efaec92d16f317956a49f63c2fd8a7 Apr 18 02:55:28.221585 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:28.221534 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj" event={"ID":"4abf3863-834d-4961-bcd8-40633bf2747f","Type":"ContainerStarted","Data":"1789144bd4867f697662fb900559eb3a70efaec92d16f317956a49f63c2fd8a7"} Apr 18 02:55:30.651416 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:55:30.651385 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-847dcd4d56-24mgv"] Apr 18 02:55:30.655042 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.655015 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.672499 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.672469 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-847dcd4d56-24mgv"] Apr 18 02:55:30.708114 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.708074 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4485fc-583f-4051-97ef-17721830f3e0-console-serving-cert\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.708311 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.708126 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7d4485fc-583f-4051-97ef-17721830f3e0-console-oauth-config\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.708311 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.708182 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d4485fc-583f-4051-97ef-17721830f3e0-service-ca\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.708311 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.708210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4485fc-583f-4051-97ef-17721830f3e0-trusted-ca-bundle\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.708311 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.708243 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d4485fc-583f-4051-97ef-17721830f3e0-console-config\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.708311 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.708265 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d4485fc-583f-4051-97ef-17721830f3e0-oauth-serving-cert\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.708504 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.708314 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7nr5\" (UniqueName: \"kubernetes.io/projected/7d4485fc-583f-4051-97ef-17721830f3e0-kube-api-access-p7nr5\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.734468 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.734432 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-dbbvs"] Apr 18 02:55:30.737866 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.737844 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-dbbvs" Apr 18 02:55:30.740252 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.740229 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-2tzq5\"" Apr 18 02:55:30.745455 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.745210 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-dbbvs"] Apr 18 02:55:30.809226 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.809186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4485fc-583f-4051-97ef-17721830f3e0-console-serving-cert\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.809428 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.809240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7d4485fc-583f-4051-97ef-17721830f3e0-console-oauth-config\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.809428 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.809355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d4485fc-583f-4051-97ef-17721830f3e0-service-ca\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.809428 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.809404 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7d4485fc-583f-4051-97ef-17721830f3e0-trusted-ca-bundle\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.809607 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.809447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d4485fc-583f-4051-97ef-17721830f3e0-console-config\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.809607 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.809472 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d4485fc-583f-4051-97ef-17721830f3e0-oauth-serving-cert\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.809607 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.809517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7nr5\" (UniqueName: \"kubernetes.io/projected/7d4485fc-583f-4051-97ef-17721830f3e0-kube-api-access-p7nr5\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.809745 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.809634 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9fzd\" (UniqueName: \"kubernetes.io/projected/16b1cf91-83fa-4379-93a6-1e808fa66a29-kube-api-access-b9fzd\") pod \"authorino-operator-657f44b778-dbbvs\" (UID: \"16b1cf91-83fa-4379-93a6-1e808fa66a29\") " pod="kuadrant-system/authorino-operator-657f44b778-dbbvs" Apr 18 02:55:30.810061 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.810032 
2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d4485fc-583f-4051-97ef-17721830f3e0-service-ca\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.810389 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.810369 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d4485fc-583f-4051-97ef-17721830f3e0-console-config\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.810465 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.810452 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4485fc-583f-4051-97ef-17721830f3e0-trusted-ca-bundle\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.810527 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.810502 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d4485fc-583f-4051-97ef-17721830f3e0-oauth-serving-cert\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.812300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.812280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4485fc-583f-4051-97ef-17721830f3e0-console-serving-cert\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.812384 ip-10-0-128-79 kubenswrapper[2577]: I0418 
02:55:30.812297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7d4485fc-583f-4051-97ef-17721830f3e0-console-oauth-config\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.819906 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.819876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7nr5\" (UniqueName: \"kubernetes.io/projected/7d4485fc-583f-4051-97ef-17721830f3e0-kube-api-access-p7nr5\") pod \"console-847dcd4d56-24mgv\" (UID: \"7d4485fc-583f-4051-97ef-17721830f3e0\") " pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:30.911161 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.911066 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9fzd\" (UniqueName: \"kubernetes.io/projected/16b1cf91-83fa-4379-93a6-1e808fa66a29-kube-api-access-b9fzd\") pod \"authorino-operator-657f44b778-dbbvs\" (UID: \"16b1cf91-83fa-4379-93a6-1e808fa66a29\") " pod="kuadrant-system/authorino-operator-657f44b778-dbbvs" Apr 18 02:55:30.920123 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.920092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9fzd\" (UniqueName: \"kubernetes.io/projected/16b1cf91-83fa-4379-93a6-1e808fa66a29-kube-api-access-b9fzd\") pod \"authorino-operator-657f44b778-dbbvs\" (UID: \"16b1cf91-83fa-4379-93a6-1e808fa66a29\") " pod="kuadrant-system/authorino-operator-657f44b778-dbbvs" Apr 18 02:55:30.965994 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:30.965952 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:31.049916 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:31.049879 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-dbbvs" Apr 18 02:55:31.105945 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:31.105799 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-847dcd4d56-24mgv"] Apr 18 02:55:31.108733 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:55:31.108687 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d4485fc_583f_4051_97ef_17721830f3e0.slice/crio-3d4f683ea296a7ad84f46e28d126b0e3aee5200855ee67ca864d88f204e70086 WatchSource:0}: Error finding container 3d4f683ea296a7ad84f46e28d126b0e3aee5200855ee67ca864d88f204e70086: Status 404 returned error can't find the container with id 3d4f683ea296a7ad84f46e28d126b0e3aee5200855ee67ca864d88f204e70086 Apr 18 02:55:31.194883 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:31.194858 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-dbbvs"] Apr 18 02:55:31.196782 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:55:31.196758 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16b1cf91_83fa_4379_93a6_1e808fa66a29.slice/crio-6a035d6a13fb60d07bd4cd3dde82bb6254f2bab80cb70cf98efa2b2d6aa229e3 WatchSource:0}: Error finding container 6a035d6a13fb60d07bd4cd3dde82bb6254f2bab80cb70cf98efa2b2d6aa229e3: Status 404 returned error can't find the container with id 6a035d6a13fb60d07bd4cd3dde82bb6254f2bab80cb70cf98efa2b2d6aa229e3 Apr 18 02:55:31.238996 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:31.238960 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj" event={"ID":"4abf3863-834d-4961-bcd8-40633bf2747f","Type":"ContainerStarted","Data":"b0688ec0e889328b188fbf3ee22a3fa621a96defd1a5bcc8a3f20e35b39f2064"} Apr 18 02:55:31.239161 ip-10-0-128-79 kubenswrapper[2577]: 
I0418 02:55:31.239055 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj" Apr 18 02:55:31.240355 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:31.240329 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-dbbvs" event={"ID":"16b1cf91-83fa-4379-93a6-1e808fa66a29","Type":"ContainerStarted","Data":"6a035d6a13fb60d07bd4cd3dde82bb6254f2bab80cb70cf98efa2b2d6aa229e3"} Apr 18 02:55:31.241980 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:31.241957 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847dcd4d56-24mgv" event={"ID":"7d4485fc-583f-4051-97ef-17721830f3e0","Type":"ContainerStarted","Data":"8a157cadac00388e4d30f6a406a4080af886b2a70d0076d04cd7a9c4e4dd5ce0"} Apr 18 02:55:31.242065 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:31.241985 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847dcd4d56-24mgv" event={"ID":"7d4485fc-583f-4051-97ef-17721830f3e0","Type":"ContainerStarted","Data":"3d4f683ea296a7ad84f46e28d126b0e3aee5200855ee67ca864d88f204e70086"} Apr 18 02:55:31.262639 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:31.262591 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj" podStartSLOduration=1.883491386 podStartE2EDuration="4.262577644s" podCreationTimestamp="2026-04-18 02:55:27 +0000 UTC" firstStartedPulling="2026-04-18 02:55:27.828456015 +0000 UTC m=+570.968307332" lastFinishedPulling="2026-04-18 02:55:30.207542273 +0000 UTC m=+573.347393590" observedRunningTime="2026-04-18 02:55:31.259820901 +0000 UTC m=+574.399672239" watchObservedRunningTime="2026-04-18 02:55:31.262577644 +0000 UTC m=+574.402428975" Apr 18 02:55:31.288262 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:31.288208 2577 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-console/console-847dcd4d56-24mgv" podStartSLOduration=1.288193634 podStartE2EDuration="1.288193634s" podCreationTimestamp="2026-04-18 02:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:55:31.286387423 +0000 UTC m=+574.426238759" watchObservedRunningTime="2026-04-18 02:55:31.288193634 +0000 UTC m=+574.428044968" Apr 18 02:55:33.251122 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:33.251088 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-dbbvs" event={"ID":"16b1cf91-83fa-4379-93a6-1e808fa66a29","Type":"ContainerStarted","Data":"66a06edcf0c3a4a25d39defe184cab114cde37c5c4c284be96755535f7dd7227"} Apr 18 02:55:33.251501 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:33.251144 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-dbbvs" Apr 18 02:55:33.267327 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:33.267275 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-dbbvs" podStartSLOduration=1.802847639 podStartE2EDuration="3.26726136s" podCreationTimestamp="2026-04-18 02:55:30 +0000 UTC" firstStartedPulling="2026-04-18 02:55:31.199275694 +0000 UTC m=+574.339127008" lastFinishedPulling="2026-04-18 02:55:32.663689416 +0000 UTC m=+575.803540729" observedRunningTime="2026-04-18 02:55:33.265172819 +0000 UTC m=+576.405024179" watchObservedRunningTime="2026-04-18 02:55:33.26726136 +0000 UTC m=+576.407112695" Apr 18 02:55:40.966898 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:40.966864 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:40.967437 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:40.967003 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:40.971766 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:40.971747 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:41.288461 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:41.288430 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-847dcd4d56-24mgv" Apr 18 02:55:41.335179 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:41.335144 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-99fbf494f-hl6gk"] Apr 18 02:55:42.248991 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:42.248961 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b86bj" Apr 18 02:55:44.256429 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:44.256398 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-dbbvs" Apr 18 02:55:57.298986 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:57.298949 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 02:55:57.299450 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:57.299256 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 02:55:57.303476 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:57.303451 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 02:55:57.303660 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:55:57.303451 2577 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 02:56:06.358727 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.358671 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-99fbf494f-hl6gk" podUID="51c97bd2-5b45-441e-84a0-6c8527e6691b" containerName="console" containerID="cri-o://1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e" gracePeriod=15 Apr 18 02:56:06.608390 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.608365 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-99fbf494f-hl6gk_51c97bd2-5b45-441e-84a0-6c8527e6691b/console/0.log" Apr 18 02:56:06.608530 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.608438 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:56:06.735124 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.735093 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-oauth-serving-cert\") pod \"51c97bd2-5b45-441e-84a0-6c8527e6691b\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " Apr 18 02:56:06.735291 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.735135 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-config\") pod \"51c97bd2-5b45-441e-84a0-6c8527e6691b\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " Apr 18 02:56:06.735291 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.735162 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsdrv\" (UniqueName: 
\"kubernetes.io/projected/51c97bd2-5b45-441e-84a0-6c8527e6691b-kube-api-access-rsdrv\") pod \"51c97bd2-5b45-441e-84a0-6c8527e6691b\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " Apr 18 02:56:06.735291 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.735219 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-oauth-config\") pod \"51c97bd2-5b45-441e-84a0-6c8527e6691b\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " Apr 18 02:56:06.735291 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.735256 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-service-ca\") pod \"51c97bd2-5b45-441e-84a0-6c8527e6691b\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " Apr 18 02:56:06.735291 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.735277 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-trusted-ca-bundle\") pod \"51c97bd2-5b45-441e-84a0-6c8527e6691b\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " Apr 18 02:56:06.735669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.735369 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-serving-cert\") pod \"51c97bd2-5b45-441e-84a0-6c8527e6691b\" (UID: \"51c97bd2-5b45-441e-84a0-6c8527e6691b\") " Apr 18 02:56:06.735669 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.735500 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-config" (OuterVolumeSpecName: "console-config") pod 
"51c97bd2-5b45-441e-84a0-6c8527e6691b" (UID: "51c97bd2-5b45-441e-84a0-6c8527e6691b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:56:06.735773 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.735701 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-service-ca" (OuterVolumeSpecName: "service-ca") pod "51c97bd2-5b45-441e-84a0-6c8527e6691b" (UID: "51c97bd2-5b45-441e-84a0-6c8527e6691b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:56:06.735773 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.735733 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-config\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:56:06.735864 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.735823 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "51c97bd2-5b45-441e-84a0-6c8527e6691b" (UID: "51c97bd2-5b45-441e-84a0-6c8527e6691b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:56:06.735900 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.735870 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "51c97bd2-5b45-441e-84a0-6c8527e6691b" (UID: "51c97bd2-5b45-441e-84a0-6c8527e6691b"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:56:06.737462 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.737440 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "51c97bd2-5b45-441e-84a0-6c8527e6691b" (UID: "51c97bd2-5b45-441e-84a0-6c8527e6691b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:56:06.737918 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.737896 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "51c97bd2-5b45-441e-84a0-6c8527e6691b" (UID: "51c97bd2-5b45-441e-84a0-6c8527e6691b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:56:06.737975 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.737928 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51c97bd2-5b45-441e-84a0-6c8527e6691b-kube-api-access-rsdrv" (OuterVolumeSpecName: "kube-api-access-rsdrv") pod "51c97bd2-5b45-441e-84a0-6c8527e6691b" (UID: "51c97bd2-5b45-441e-84a0-6c8527e6691b"). InnerVolumeSpecName "kube-api-access-rsdrv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:56:06.836473 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.836440 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-serving-cert\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:56:06.836473 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.836471 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-oauth-serving-cert\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:56:06.836690 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.836483 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rsdrv\" (UniqueName: \"kubernetes.io/projected/51c97bd2-5b45-441e-84a0-6c8527e6691b-kube-api-access-rsdrv\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:56:06.836690 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.836497 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51c97bd2-5b45-441e-84a0-6c8527e6691b-console-oauth-config\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:56:06.836690 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.836509 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-service-ca\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:56:06.836690 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:06.836521 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51c97bd2-5b45-441e-84a0-6c8527e6691b-trusted-ca-bundle\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:56:07.382464 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:56:07.382438 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-99fbf494f-hl6gk_51c97bd2-5b45-441e-84a0-6c8527e6691b/console/0.log" Apr 18 02:56:07.382926 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:07.382485 2577 generic.go:358] "Generic (PLEG): container finished" podID="51c97bd2-5b45-441e-84a0-6c8527e6691b" containerID="1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e" exitCode=2 Apr 18 02:56:07.382926 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:07.382561 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-99fbf494f-hl6gk" Apr 18 02:56:07.382926 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:07.382583 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-99fbf494f-hl6gk" event={"ID":"51c97bd2-5b45-441e-84a0-6c8527e6691b","Type":"ContainerDied","Data":"1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e"} Apr 18 02:56:07.382926 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:07.382617 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-99fbf494f-hl6gk" event={"ID":"51c97bd2-5b45-441e-84a0-6c8527e6691b","Type":"ContainerDied","Data":"8b351aee3f2ad0e90c549b2efc29d6ce3ff13051da23c3a5421224c627a82900"} Apr 18 02:56:07.382926 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:07.382633 2577 scope.go:117] "RemoveContainer" containerID="1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e" Apr 18 02:56:07.391031 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:07.391011 2577 scope.go:117] "RemoveContainer" containerID="1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e" Apr 18 02:56:07.391306 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:56:07.391288 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e\": container with ID starting with 1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e not found: ID does not exist" containerID="1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e" Apr 18 02:56:07.391354 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:07.391315 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e"} err="failed to get container status \"1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e\": rpc error: code = NotFound desc = could not find container \"1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e\": container with ID starting with 1ccb2ae95f02960f98c93d0616c3498df329f0d4e370788672eaa942e2a5487e not found: ID does not exist" Apr 18 02:56:07.400796 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:07.400748 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-99fbf494f-hl6gk"] Apr 18 02:56:07.403904 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:07.403880 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-99fbf494f-hl6gk"] Apr 18 02:56:09.362481 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:09.362449 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51c97bd2-5b45-441e-84a0-6c8527e6691b" path="/var/lib/kubelet/pods/51c97bd2-5b45-441e-84a0-6c8527e6691b/volumes" Apr 18 02:56:13.998510 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:13.998477 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm"] Apr 18 02:56:13.999264 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:13.998852 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51c97bd2-5b45-441e-84a0-6c8527e6691b" containerName="console" Apr 18 02:56:13.999264 
ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:13.998865 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c97bd2-5b45-441e-84a0-6c8527e6691b" containerName="console" Apr 18 02:56:13.999264 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:13.998952 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="51c97bd2-5b45-441e-84a0-6c8527e6691b" containerName="console" Apr 18 02:56:14.003780 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.003752 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.010849 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.010808 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-vlpvb\"" Apr 18 02:56:14.023161 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.023126 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm"] Apr 18 02:56:14.100916 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.100879 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.101107 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.100979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.101107 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.101031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.101107 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.101096 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.101277 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.101144 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.101277 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.101172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgmv6\" (UniqueName: \"kubernetes.io/projected/e3ce470d-8790-41a4-9bbe-2a771cb5191c-kube-api-access-hgmv6\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.101277 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.101208 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.101277 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.101244 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.101450 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.101307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.202433 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.202397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.202660 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.202451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.202660 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.202497 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.202660 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.202529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.202660 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.202621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.202660 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:56:14.202660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.202923 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.202687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgmv6\" (UniqueName: \"kubernetes.io/projected/e3ce470d-8790-41a4-9bbe-2a771cb5191c-kube-api-access-hgmv6\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.202923 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.202725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.202923 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.202758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.202923 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.202882 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.203129 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.203101 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.203349 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.203324 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.203628 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.203605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.204139 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.204114 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istiod-ca-cert\") pod 
\"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.205619 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.205591 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.205995 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.205973 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.211434 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.211411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e3ce470d-8790-41a4-9bbe-2a771cb5191c-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.211738 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.211718 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgmv6\" (UniqueName: \"kubernetes.io/projected/e3ce470d-8790-41a4-9bbe-2a771cb5191c-kube-api-access-hgmv6\") pod \"maas-default-gateway-openshift-default-845c6b4b48-nchxm\" (UID: \"e3ce470d-8790-41a4-9bbe-2a771cb5191c\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.316945 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.316811 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:14.455606 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.455511 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm"] Apr 18 02:56:14.462117 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.462078 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 18 02:56:14.462234 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.462163 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 18 02:56:14.462234 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:14.462206 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 18 02:56:15.414824 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:15.414791 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" event={"ID":"e3ce470d-8790-41a4-9bbe-2a771cb5191c","Type":"ContainerStarted","Data":"a1c5fe188e7a2d1fad71f51cca00c3b399b83fa981c1fa476c9afb01d306a128"} Apr 18 02:56:15.414824 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:15.414828 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" 
event={"ID":"e3ce470d-8790-41a4-9bbe-2a771cb5191c","Type":"ContainerStarted","Data":"f64fb1ea69cdc321590f8684d711b2f6bb4377e17216b8ce842775d6f4109dc7"} Apr 18 02:56:15.433524 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:15.433470 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" podStartSLOduration=2.433454761 podStartE2EDuration="2.433454761s" podCreationTimestamp="2026-04-18 02:56:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:56:15.430786748 +0000 UTC m=+618.570638082" watchObservedRunningTime="2026-04-18 02:56:15.433454761 +0000 UTC m=+618.573306093" Apr 18 02:56:16.317004 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:16.316964 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:16.322099 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:16.322076 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:16.418247 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:16.418218 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:16.419002 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:16.418985 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-nchxm" Apr 18 02:56:19.156177 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:19.156135 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-jwwtx"] Apr 18 02:56:19.159437 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:19.159416 2577 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-jwwtx" Apr 18 02:56:19.161828 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:19.161807 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-gkf5k\"" Apr 18 02:56:19.166072 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:19.166044 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-jwwtx"] Apr 18 02:56:19.247169 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:19.247102 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g5g2\" (UniqueName: \"kubernetes.io/projected/27ed257a-7ad2-4757-85e5-8250aa33076d-kube-api-access-9g5g2\") pod \"authorino-7498df8756-jwwtx\" (UID: \"27ed257a-7ad2-4757-85e5-8250aa33076d\") " pod="kuadrant-system/authorino-7498df8756-jwwtx" Apr 18 02:56:19.348631 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:19.348588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9g5g2\" (UniqueName: \"kubernetes.io/projected/27ed257a-7ad2-4757-85e5-8250aa33076d-kube-api-access-9g5g2\") pod \"authorino-7498df8756-jwwtx\" (UID: \"27ed257a-7ad2-4757-85e5-8250aa33076d\") " pod="kuadrant-system/authorino-7498df8756-jwwtx" Apr 18 02:56:19.360303 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:19.360273 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g5g2\" (UniqueName: \"kubernetes.io/projected/27ed257a-7ad2-4757-85e5-8250aa33076d-kube-api-access-9g5g2\") pod \"authorino-7498df8756-jwwtx\" (UID: \"27ed257a-7ad2-4757-85e5-8250aa33076d\") " pod="kuadrant-system/authorino-7498df8756-jwwtx" Apr 18 02:56:19.472687 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:19.472606 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-jwwtx" Apr 18 02:56:19.597268 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:19.597242 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-jwwtx"] Apr 18 02:56:19.599255 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:56:19.599221 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27ed257a_7ad2_4757_85e5_8250aa33076d.slice/crio-2b37d428fd7ff04b3249c02adcbcce368e16f25f1a8862bae3f180d8029a67fc WatchSource:0}: Error finding container 2b37d428fd7ff04b3249c02adcbcce368e16f25f1a8862bae3f180d8029a67fc: Status 404 returned error can't find the container with id 2b37d428fd7ff04b3249c02adcbcce368e16f25f1a8862bae3f180d8029a67fc Apr 18 02:56:20.436471 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:20.436435 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-jwwtx" event={"ID":"27ed257a-7ad2-4757-85e5-8250aa33076d","Type":"ContainerStarted","Data":"2b37d428fd7ff04b3249c02adcbcce368e16f25f1a8862bae3f180d8029a67fc"} Apr 18 02:56:22.447204 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:22.447159 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-jwwtx" event={"ID":"27ed257a-7ad2-4757-85e5-8250aa33076d","Type":"ContainerStarted","Data":"9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b"} Apr 18 02:56:22.462694 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:22.462636 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-jwwtx" podStartSLOduration=1.266076058 podStartE2EDuration="3.462617409s" podCreationTimestamp="2026-04-18 02:56:19 +0000 UTC" firstStartedPulling="2026-04-18 02:56:19.600597653 +0000 UTC m=+622.740448966" lastFinishedPulling="2026-04-18 02:56:21.797139004 +0000 UTC m=+624.936990317" 
observedRunningTime="2026-04-18 02:56:22.459866995 +0000 UTC m=+625.599718329" watchObservedRunningTime="2026-04-18 02:56:22.462617409 +0000 UTC m=+625.602468744" Apr 18 02:56:48.203360 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.203327 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-jwwtx"] Apr 18 02:56:48.203895 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.203604 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-jwwtx" podUID="27ed257a-7ad2-4757-85e5-8250aa33076d" containerName="authorino" containerID="cri-o://9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b" gracePeriod=30 Apr 18 02:56:48.446759 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.446734 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-jwwtx" Apr 18 02:56:48.546024 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.545993 2577 generic.go:358] "Generic (PLEG): container finished" podID="27ed257a-7ad2-4757-85e5-8250aa33076d" containerID="9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b" exitCode=0 Apr 18 02:56:48.546164 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.546042 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-jwwtx" Apr 18 02:56:48.546164 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.546047 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-jwwtx" event={"ID":"27ed257a-7ad2-4757-85e5-8250aa33076d","Type":"ContainerDied","Data":"9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b"} Apr 18 02:56:48.546164 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.546073 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-jwwtx" event={"ID":"27ed257a-7ad2-4757-85e5-8250aa33076d","Type":"ContainerDied","Data":"2b37d428fd7ff04b3249c02adcbcce368e16f25f1a8862bae3f180d8029a67fc"} Apr 18 02:56:48.546164 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.546097 2577 scope.go:117] "RemoveContainer" containerID="9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b" Apr 18 02:56:48.554621 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.554601 2577 scope.go:117] "RemoveContainer" containerID="9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b" Apr 18 02:56:48.554869 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:56:48.554850 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b\": container with ID starting with 9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b not found: ID does not exist" containerID="9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b" Apr 18 02:56:48.554923 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.554877 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b"} err="failed to get container status \"9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b\": rpc error: code = 
NotFound desc = could not find container \"9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b\": container with ID starting with 9f54238cdeddee440fcf4d3e1a9f7a13911def7f9cef3c43f47d83f180e0d61b not found: ID does not exist" Apr 18 02:56:48.609236 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.609201 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g5g2\" (UniqueName: \"kubernetes.io/projected/27ed257a-7ad2-4757-85e5-8250aa33076d-kube-api-access-9g5g2\") pod \"27ed257a-7ad2-4757-85e5-8250aa33076d\" (UID: \"27ed257a-7ad2-4757-85e5-8250aa33076d\") " Apr 18 02:56:48.611293 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.611264 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27ed257a-7ad2-4757-85e5-8250aa33076d-kube-api-access-9g5g2" (OuterVolumeSpecName: "kube-api-access-9g5g2") pod "27ed257a-7ad2-4757-85e5-8250aa33076d" (UID: "27ed257a-7ad2-4757-85e5-8250aa33076d"). InnerVolumeSpecName "kube-api-access-9g5g2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:56:48.709965 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.709930 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9g5g2\" (UniqueName: \"kubernetes.io/projected/27ed257a-7ad2-4757-85e5-8250aa33076d-kube-api-access-9g5g2\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:56:48.869432 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.869399 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-jwwtx"] Apr 18 02:56:48.875566 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.875530 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-jwwtx"] Apr 18 02:56:48.980907 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.980876 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-whdgl"] Apr 18 02:56:48.981267 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.981253 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27ed257a-7ad2-4757-85e5-8250aa33076d" containerName="authorino" Apr 18 02:56:48.981315 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.981268 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ed257a-7ad2-4757-85e5-8250aa33076d" containerName="authorino" Apr 18 02:56:48.981364 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.981325 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="27ed257a-7ad2-4757-85e5-8250aa33076d" containerName="authorino" Apr 18 02:56:48.985626 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.985607 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" Apr 18 02:56:48.988406 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.988375 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-6qtxj\"" Apr 18 02:56:48.993928 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:48.993907 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-whdgl"] Apr 18 02:56:49.114332 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.114239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw52j\" (UniqueName: \"kubernetes.io/projected/af344854-4eaf-45cf-9bce-a23ee1b16e28-kube-api-access-lw52j\") pod \"maas-controller-6d4c8f55f9-whdgl\" (UID: \"af344854-4eaf-45cf-9bce-a23ee1b16e28\") " pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" Apr 18 02:56:49.137602 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.137570 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-855b5fc8cd-xgkm6"] Apr 18 02:56:49.140997 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.140982 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" Apr 18 02:56:49.147598 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.147540 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-855b5fc8cd-xgkm6"] Apr 18 02:56:49.215860 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.215827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw52j\" (UniqueName: \"kubernetes.io/projected/af344854-4eaf-45cf-9bce-a23ee1b16e28-kube-api-access-lw52j\") pod \"maas-controller-6d4c8f55f9-whdgl\" (UID: \"af344854-4eaf-45cf-9bce-a23ee1b16e28\") " pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" Apr 18 02:56:49.223281 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.223248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw52j\" (UniqueName: \"kubernetes.io/projected/af344854-4eaf-45cf-9bce-a23ee1b16e28-kube-api-access-lw52j\") pod \"maas-controller-6d4c8f55f9-whdgl\" (UID: \"af344854-4eaf-45cf-9bce-a23ee1b16e28\") " pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" Apr 18 02:56:49.254113 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.254079 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-whdgl"] Apr 18 02:56:49.254386 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.254372 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" Apr 18 02:56:49.275482 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.275450 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-874df8574-9pd27"] Apr 18 02:56:49.281173 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.280788 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-874df8574-9pd27" Apr 18 02:56:49.284078 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.284048 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-874df8574-9pd27"] Apr 18 02:56:49.317402 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.317366 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj59q\" (UniqueName: \"kubernetes.io/projected/981b405e-e456-483b-bf75-f8ef03a56b4b-kube-api-access-pj59q\") pod \"maas-controller-855b5fc8cd-xgkm6\" (UID: \"981b405e-e456-483b-bf75-f8ef03a56b4b\") " pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" Apr 18 02:56:49.363880 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.363845 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27ed257a-7ad2-4757-85e5-8250aa33076d" path="/var/lib/kubelet/pods/27ed257a-7ad2-4757-85e5-8250aa33076d/volumes" Apr 18 02:56:49.391528 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.391502 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-whdgl"] Apr 18 02:56:49.393950 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:56:49.393915 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf344854_4eaf_45cf_9bce_a23ee1b16e28.slice/crio-17c825f957c1a74bf685c9cc3dfcff6ec6bff9e3d541bfc9da84ed6c76e8714c WatchSource:0}: Error finding container 17c825f957c1a74bf685c9cc3dfcff6ec6bff9e3d541bfc9da84ed6c76e8714c: Status 404 returned error can't find the container with id 17c825f957c1a74bf685c9cc3dfcff6ec6bff9e3d541bfc9da84ed6c76e8714c Apr 18 02:56:49.396235 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.396215 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 18 02:56:49.418077 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.418051 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pj59q\" (UniqueName: \"kubernetes.io/projected/981b405e-e456-483b-bf75-f8ef03a56b4b-kube-api-access-pj59q\") pod \"maas-controller-855b5fc8cd-xgkm6\" (UID: \"981b405e-e456-483b-bf75-f8ef03a56b4b\") " pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" Apr 18 02:56:49.418169 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.418095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhnb4\" (UniqueName: \"kubernetes.io/projected/63e6b508-15b0-4732-9245-e19e2ce5ae5d-kube-api-access-dhnb4\") pod \"maas-controller-874df8574-9pd27\" (UID: \"63e6b508-15b0-4732-9245-e19e2ce5ae5d\") " pod="opendatahub/maas-controller-874df8574-9pd27" Apr 18 02:56:49.425210 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.425190 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj59q\" (UniqueName: \"kubernetes.io/projected/981b405e-e456-483b-bf75-f8ef03a56b4b-kube-api-access-pj59q\") pod \"maas-controller-855b5fc8cd-xgkm6\" (UID: \"981b405e-e456-483b-bf75-f8ef03a56b4b\") " pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" Apr 18 02:56:49.452149 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.452127 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" Apr 18 02:56:49.518638 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.518608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhnb4\" (UniqueName: \"kubernetes.io/projected/63e6b508-15b0-4732-9245-e19e2ce5ae5d-kube-api-access-dhnb4\") pod \"maas-controller-874df8574-9pd27\" (UID: \"63e6b508-15b0-4732-9245-e19e2ce5ae5d\") " pod="opendatahub/maas-controller-874df8574-9pd27" Apr 18 02:56:49.526695 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.526538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhnb4\" (UniqueName: \"kubernetes.io/projected/63e6b508-15b0-4732-9245-e19e2ce5ae5d-kube-api-access-dhnb4\") pod \"maas-controller-874df8574-9pd27\" (UID: \"63e6b508-15b0-4732-9245-e19e2ce5ae5d\") " pod="opendatahub/maas-controller-874df8574-9pd27" Apr 18 02:56:49.553782 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.553741 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" event={"ID":"af344854-4eaf-45cf-9bce-a23ee1b16e28","Type":"ContainerStarted","Data":"17c825f957c1a74bf685c9cc3dfcff6ec6bff9e3d541bfc9da84ed6c76e8714c"} Apr 18 02:56:49.572673 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.572647 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-855b5fc8cd-xgkm6"] Apr 18 02:56:49.574540 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:56:49.574507 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod981b405e_e456_483b_bf75_f8ef03a56b4b.slice/crio-21da6f06d5448b23d2b2c7889617ea1720354938944d6c2307f92e74fd92f945 WatchSource:0}: Error finding container 21da6f06d5448b23d2b2c7889617ea1720354938944d6c2307f92e74fd92f945: Status 404 returned error can't find the container with id 
21da6f06d5448b23d2b2c7889617ea1720354938944d6c2307f92e74fd92f945 Apr 18 02:56:49.595010 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.594984 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-874df8574-9pd27" Apr 18 02:56:49.712942 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:49.712912 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-874df8574-9pd27"] Apr 18 02:56:49.715445 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:56:49.715405 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63e6b508_15b0_4732_9245_e19e2ce5ae5d.slice/crio-f3d9296be8a0ad2f4f06d77f042f9a6509b87d0f20dc2762169e00f1cefb99cb WatchSource:0}: Error finding container f3d9296be8a0ad2f4f06d77f042f9a6509b87d0f20dc2762169e00f1cefb99cb: Status 404 returned error can't find the container with id f3d9296be8a0ad2f4f06d77f042f9a6509b87d0f20dc2762169e00f1cefb99cb Apr 18 02:56:50.566473 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:50.566419 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-874df8574-9pd27" event={"ID":"63e6b508-15b0-4732-9245-e19e2ce5ae5d","Type":"ContainerStarted","Data":"f3d9296be8a0ad2f4f06d77f042f9a6509b87d0f20dc2762169e00f1cefb99cb"} Apr 18 02:56:50.568106 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:50.568078 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" event={"ID":"981b405e-e456-483b-bf75-f8ef03a56b4b","Type":"ContainerStarted","Data":"21da6f06d5448b23d2b2c7889617ea1720354938944d6c2307f92e74fd92f945"} Apr 18 02:56:53.582003 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.581967 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" 
event={"ID":"981b405e-e456-483b-bf75-f8ef03a56b4b","Type":"ContainerStarted","Data":"3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a"} Apr 18 02:56:53.582538 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.582036 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" Apr 18 02:56:53.583605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.583572 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" event={"ID":"af344854-4eaf-45cf-9bce-a23ee1b16e28","Type":"ContainerStarted","Data":"108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7"} Apr 18 02:56:53.583605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.583606 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" Apr 18 02:56:53.583777 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.583578 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" podUID="af344854-4eaf-45cf-9bce-a23ee1b16e28" containerName="manager" containerID="cri-o://108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7" gracePeriod=10 Apr 18 02:56:53.584976 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.584953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-874df8574-9pd27" event={"ID":"63e6b508-15b0-4732-9245-e19e2ce5ae5d","Type":"ContainerStarted","Data":"b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183"} Apr 18 02:56:53.585102 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.585082 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-874df8574-9pd27" Apr 18 02:56:53.598261 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.598139 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" podStartSLOduration=1.581934108 podStartE2EDuration="4.598127097s" podCreationTimestamp="2026-04-18 02:56:49 +0000 UTC" firstStartedPulling="2026-04-18 02:56:49.575930052 +0000 UTC m=+652.715781365" lastFinishedPulling="2026-04-18 02:56:52.592123037 +0000 UTC m=+655.731974354" observedRunningTime="2026-04-18 02:56:53.596619254 +0000 UTC m=+656.736470589" watchObservedRunningTime="2026-04-18 02:56:53.598127097 +0000 UTC m=+656.737978482" Apr 18 02:56:53.612749 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.612713 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" podStartSLOduration=2.417062703 podStartE2EDuration="5.612702251s" podCreationTimestamp="2026-04-18 02:56:48 +0000 UTC" firstStartedPulling="2026-04-18 02:56:49.396351642 +0000 UTC m=+652.536202955" lastFinishedPulling="2026-04-18 02:56:52.59199119 +0000 UTC m=+655.731842503" observedRunningTime="2026-04-18 02:56:53.611013369 +0000 UTC m=+656.750864705" watchObservedRunningTime="2026-04-18 02:56:53.612702251 +0000 UTC m=+656.752553585" Apr 18 02:56:53.627365 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.627329 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-874df8574-9pd27" podStartSLOduration=1.745092603 podStartE2EDuration="4.627319448s" podCreationTimestamp="2026-04-18 02:56:49 +0000 UTC" firstStartedPulling="2026-04-18 02:56:49.717159395 +0000 UTC m=+652.857010713" lastFinishedPulling="2026-04-18 02:56:52.599386245 +0000 UTC m=+655.739237558" observedRunningTime="2026-04-18 02:56:53.624941907 +0000 UTC m=+656.764793241" watchObservedRunningTime="2026-04-18 02:56:53.627319448 +0000 UTC m=+656.767170832" Apr 18 02:56:53.825425 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.825397 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" Apr 18 02:56:53.963771 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.963694 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw52j\" (UniqueName: \"kubernetes.io/projected/af344854-4eaf-45cf-9bce-a23ee1b16e28-kube-api-access-lw52j\") pod \"af344854-4eaf-45cf-9bce-a23ee1b16e28\" (UID: \"af344854-4eaf-45cf-9bce-a23ee1b16e28\") " Apr 18 02:56:53.965802 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:53.965775 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af344854-4eaf-45cf-9bce-a23ee1b16e28-kube-api-access-lw52j" (OuterVolumeSpecName: "kube-api-access-lw52j") pod "af344854-4eaf-45cf-9bce-a23ee1b16e28" (UID: "af344854-4eaf-45cf-9bce-a23ee1b16e28"). InnerVolumeSpecName "kube-api-access-lw52j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:56:54.065322 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.065298 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lw52j\" (UniqueName: \"kubernetes.io/projected/af344854-4eaf-45cf-9bce-a23ee1b16e28-kube-api-access-lw52j\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:56:54.590310 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.590270 2577 generic.go:358] "Generic (PLEG): container finished" podID="af344854-4eaf-45cf-9bce-a23ee1b16e28" containerID="108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7" exitCode=0 Apr 18 02:56:54.590792 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.590332 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" Apr 18 02:56:54.590792 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.590350 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" event={"ID":"af344854-4eaf-45cf-9bce-a23ee1b16e28","Type":"ContainerDied","Data":"108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7"} Apr 18 02:56:54.590792 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.590401 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-whdgl" event={"ID":"af344854-4eaf-45cf-9bce-a23ee1b16e28","Type":"ContainerDied","Data":"17c825f957c1a74bf685c9cc3dfcff6ec6bff9e3d541bfc9da84ed6c76e8714c"} Apr 18 02:56:54.590792 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.590430 2577 scope.go:117] "RemoveContainer" containerID="108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7" Apr 18 02:56:54.599839 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.599816 2577 scope.go:117] "RemoveContainer" containerID="108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7" Apr 18 02:56:54.600105 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:56:54.600083 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7\": container with ID starting with 108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7 not found: ID does not exist" containerID="108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7" Apr 18 02:56:54.600170 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.600114 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7"} err="failed to get container status \"108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7\": rpc error: code = 
NotFound desc = could not find container \"108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7\": container with ID starting with 108612c1877e833d0c96dc19385932be28c75fb087df4fef42d2c59fdc21bad7 not found: ID does not exist" Apr 18 02:56:54.617012 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.616988 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-whdgl"] Apr 18 02:56:54.623288 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.623268 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-whdgl"] Apr 18 02:56:54.694244 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.694215 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-965d75d4b-gxsqw"] Apr 18 02:56:54.694755 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.694741 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af344854-4eaf-45cf-9bce-a23ee1b16e28" containerName="manager" Apr 18 02:56:54.694808 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.694759 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="af344854-4eaf-45cf-9bce-a23ee1b16e28" containerName="manager" Apr 18 02:56:54.694875 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.694865 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="af344854-4eaf-45cf-9bce-a23ee1b16e28" containerName="manager" Apr 18 02:56:54.699725 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.699709 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-965d75d4b-gxsqw" Apr 18 02:56:54.703065 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.702846 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 18 02:56:54.703065 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.702850 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-j9kkg\"" Apr 18 02:56:54.703065 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.702954 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 18 02:56:54.705839 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.705818 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-965d75d4b-gxsqw"] Apr 18 02:56:54.872294 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.872202 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/64567602-f663-487c-8beb-bb884e3abfde-maas-api-tls\") pod \"maas-api-965d75d4b-gxsqw\" (UID: \"64567602-f663-487c-8beb-bb884e3abfde\") " pod="opendatahub/maas-api-965d75d4b-gxsqw" Apr 18 02:56:54.872294 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.872247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxgpx\" (UniqueName: \"kubernetes.io/projected/64567602-f663-487c-8beb-bb884e3abfde-kube-api-access-cxgpx\") pod \"maas-api-965d75d4b-gxsqw\" (UID: \"64567602-f663-487c-8beb-bb884e3abfde\") " pod="opendatahub/maas-api-965d75d4b-gxsqw" Apr 18 02:56:54.973500 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.973467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/64567602-f663-487c-8beb-bb884e3abfde-maas-api-tls\") pod 
\"maas-api-965d75d4b-gxsqw\" (UID: \"64567602-f663-487c-8beb-bb884e3abfde\") " pod="opendatahub/maas-api-965d75d4b-gxsqw" Apr 18 02:56:54.973685 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.973510 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxgpx\" (UniqueName: \"kubernetes.io/projected/64567602-f663-487c-8beb-bb884e3abfde-kube-api-access-cxgpx\") pod \"maas-api-965d75d4b-gxsqw\" (UID: \"64567602-f663-487c-8beb-bb884e3abfde\") " pod="opendatahub/maas-api-965d75d4b-gxsqw" Apr 18 02:56:54.975918 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.975894 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/64567602-f663-487c-8beb-bb884e3abfde-maas-api-tls\") pod \"maas-api-965d75d4b-gxsqw\" (UID: \"64567602-f663-487c-8beb-bb884e3abfde\") " pod="opendatahub/maas-api-965d75d4b-gxsqw" Apr 18 02:56:54.980925 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:54.980900 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxgpx\" (UniqueName: \"kubernetes.io/projected/64567602-f663-487c-8beb-bb884e3abfde-kube-api-access-cxgpx\") pod \"maas-api-965d75d4b-gxsqw\" (UID: \"64567602-f663-487c-8beb-bb884e3abfde\") " pod="opendatahub/maas-api-965d75d4b-gxsqw" Apr 18 02:56:55.012757 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:55.012734 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-965d75d4b-gxsqw" Apr 18 02:56:55.135716 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:55.135685 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-965d75d4b-gxsqw"] Apr 18 02:56:55.137868 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:56:55.137832 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64567602_f663_487c_8beb_bb884e3abfde.slice/crio-e4a3420d3a1d5a2c9279bd429fb2924b806112237c9e643889d17e4019871588 WatchSource:0}: Error finding container e4a3420d3a1d5a2c9279bd429fb2924b806112237c9e643889d17e4019871588: Status 404 returned error can't find the container with id e4a3420d3a1d5a2c9279bd429fb2924b806112237c9e643889d17e4019871588 Apr 18 02:56:55.364832 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:55.364797 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af344854-4eaf-45cf-9bce-a23ee1b16e28" path="/var/lib/kubelet/pods/af344854-4eaf-45cf-9bce-a23ee1b16e28/volumes" Apr 18 02:56:55.595346 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:55.595303 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-965d75d4b-gxsqw" event={"ID":"64567602-f663-487c-8beb-bb884e3abfde","Type":"ContainerStarted","Data":"e4a3420d3a1d5a2c9279bd429fb2924b806112237c9e643889d17e4019871588"} Apr 18 02:56:57.605796 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:57.605703 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-965d75d4b-gxsqw" event={"ID":"64567602-f663-487c-8beb-bb884e3abfde","Type":"ContainerStarted","Data":"d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4"} Apr 18 02:56:57.606126 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:56:57.605803 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-965d75d4b-gxsqw" Apr 18 02:56:57.623175 ip-10-0-128-79 kubenswrapper[2577]: 
I0418 02:56:57.623121 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-965d75d4b-gxsqw" podStartSLOduration=1.834502268 podStartE2EDuration="3.6231088s" podCreationTimestamp="2026-04-18 02:56:54 +0000 UTC" firstStartedPulling="2026-04-18 02:56:55.139464334 +0000 UTC m=+658.279315647" lastFinishedPulling="2026-04-18 02:56:56.928070852 +0000 UTC m=+660.067922179" observedRunningTime="2026-04-18 02:56:57.619833173 +0000 UTC m=+660.759684507" watchObservedRunningTime="2026-04-18 02:56:57.6231088 +0000 UTC m=+660.762960134" Apr 18 02:57:03.615811 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:03.615781 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-965d75d4b-gxsqw" Apr 18 02:57:04.595912 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.595882 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" Apr 18 02:57:04.596083 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.595935 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-874df8574-9pd27" Apr 18 02:57:04.635330 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.635287 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-855b5fc8cd-xgkm6"] Apr 18 02:57:04.635719 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.635483 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" podUID="981b405e-e456-483b-bf75-f8ef03a56b4b" containerName="manager" containerID="cri-o://3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a" gracePeriod=10 Apr 18 02:57:04.877804 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.877781 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" Apr 18 02:57:04.931811 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.931783 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7dbc9c957b-7447h"] Apr 18 02:57:04.932160 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.932148 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="981b405e-e456-483b-bf75-f8ef03a56b4b" containerName="manager" Apr 18 02:57:04.932205 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.932162 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="981b405e-e456-483b-bf75-f8ef03a56b4b" containerName="manager" Apr 18 02:57:04.932238 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.932220 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="981b405e-e456-483b-bf75-f8ef03a56b4b" containerName="manager" Apr 18 02:57:04.935830 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.935814 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7dbc9c957b-7447h" Apr 18 02:57:04.941607 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.941584 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7dbc9c957b-7447h"] Apr 18 02:57:04.954744 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.954721 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj59q\" (UniqueName: \"kubernetes.io/projected/981b405e-e456-483b-bf75-f8ef03a56b4b-kube-api-access-pj59q\") pod \"981b405e-e456-483b-bf75-f8ef03a56b4b\" (UID: \"981b405e-e456-483b-bf75-f8ef03a56b4b\") " Apr 18 02:57:04.954996 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.954976 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkrdh\" (UniqueName: \"kubernetes.io/projected/9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b-kube-api-access-pkrdh\") pod \"maas-controller-7dbc9c957b-7447h\" (UID: \"9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b\") " pod="opendatahub/maas-controller-7dbc9c957b-7447h" Apr 18 02:57:04.956734 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:04.956711 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/981b405e-e456-483b-bf75-f8ef03a56b4b-kube-api-access-pj59q" (OuterVolumeSpecName: "kube-api-access-pj59q") pod "981b405e-e456-483b-bf75-f8ef03a56b4b" (UID: "981b405e-e456-483b-bf75-f8ef03a56b4b"). InnerVolumeSpecName "kube-api-access-pj59q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:57:05.056114 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.056087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkrdh\" (UniqueName: \"kubernetes.io/projected/9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b-kube-api-access-pkrdh\") pod \"maas-controller-7dbc9c957b-7447h\" (UID: \"9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b\") " pod="opendatahub/maas-controller-7dbc9c957b-7447h" Apr 18 02:57:05.056229 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.056128 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pj59q\" (UniqueName: \"kubernetes.io/projected/981b405e-e456-483b-bf75-f8ef03a56b4b-kube-api-access-pj59q\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:57:05.063371 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.063348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkrdh\" (UniqueName: \"kubernetes.io/projected/9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b-kube-api-access-pkrdh\") pod \"maas-controller-7dbc9c957b-7447h\" (UID: \"9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b\") " pod="opendatahub/maas-controller-7dbc9c957b-7447h" Apr 18 02:57:05.248291 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.248272 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7dbc9c957b-7447h" Apr 18 02:57:05.574070 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.574046 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7dbc9c957b-7447h"] Apr 18 02:57:05.575472 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:57:05.575445 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e0c4bc6_d1eb_47fa_92b1_630f7c0ce58b.slice/crio-3d85ca8d2031fc3856178b38ceb38c2eb63aab1a2a22d42066403627dffde870 WatchSource:0}: Error finding container 3d85ca8d2031fc3856178b38ceb38c2eb63aab1a2a22d42066403627dffde870: Status 404 returned error can't find the container with id 3d85ca8d2031fc3856178b38ceb38c2eb63aab1a2a22d42066403627dffde870 Apr 18 02:57:05.637762 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.637729 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7dbc9c957b-7447h" event={"ID":"9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b","Type":"ContainerStarted","Data":"3d85ca8d2031fc3856178b38ceb38c2eb63aab1a2a22d42066403627dffde870"} Apr 18 02:57:05.638824 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.638800 2577 generic.go:358] "Generic (PLEG): container finished" podID="981b405e-e456-483b-bf75-f8ef03a56b4b" containerID="3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a" exitCode=0 Apr 18 02:57:05.638927 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.638862 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" Apr 18 02:57:05.638974 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.638867 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" event={"ID":"981b405e-e456-483b-bf75-f8ef03a56b4b","Type":"ContainerDied","Data":"3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a"} Apr 18 02:57:05.638974 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.638953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-855b5fc8cd-xgkm6" event={"ID":"981b405e-e456-483b-bf75-f8ef03a56b4b","Type":"ContainerDied","Data":"21da6f06d5448b23d2b2c7889617ea1720354938944d6c2307f92e74fd92f945"} Apr 18 02:57:05.638974 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.638972 2577 scope.go:117] "RemoveContainer" containerID="3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a" Apr 18 02:57:05.647446 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.647429 2577 scope.go:117] "RemoveContainer" containerID="3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a" Apr 18 02:57:05.647728 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:57:05.647706 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a\": container with ID starting with 3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a not found: ID does not exist" containerID="3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a" Apr 18 02:57:05.647832 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.647735 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a"} err="failed to get container status \"3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a\": rpc error: code = 
NotFound desc = could not find container \"3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a\": container with ID starting with 3104ab6ae5ac25482e6f294c36c59204592f1cd911ab34ed8db8757814c6469a not found: ID does not exist" Apr 18 02:57:05.657514 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.657493 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-855b5fc8cd-xgkm6"] Apr 18 02:57:05.661622 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:05.661602 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-855b5fc8cd-xgkm6"] Apr 18 02:57:06.644186 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:06.644149 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7dbc9c957b-7447h" event={"ID":"9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b","Type":"ContainerStarted","Data":"06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf"} Apr 18 02:57:06.644659 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:06.644282 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7dbc9c957b-7447h" Apr 18 02:57:06.659880 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:06.659843 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7dbc9c957b-7447h" podStartSLOduration=2.336908664 podStartE2EDuration="2.659830634s" podCreationTimestamp="2026-04-18 02:57:04 +0000 UTC" firstStartedPulling="2026-04-18 02:57:05.57681195 +0000 UTC m=+668.716663263" lastFinishedPulling="2026-04-18 02:57:05.89973391 +0000 UTC m=+669.039585233" observedRunningTime="2026-04-18 02:57:06.657455706 +0000 UTC m=+669.797307033" watchObservedRunningTime="2026-04-18 02:57:06.659830634 +0000 UTC m=+669.799682012" Apr 18 02:57:07.363128 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:07.363096 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="981b405e-e456-483b-bf75-f8ef03a56b4b" path="/var/lib/kubelet/pods/981b405e-e456-483b-bf75-f8ef03a56b4b/volumes" Apr 18 02:57:17.661358 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:17.661324 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7dbc9c957b-7447h" Apr 18 02:57:17.698815 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:17.698790 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-874df8574-9pd27"] Apr 18 02:57:17.699009 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:17.698990 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-874df8574-9pd27" podUID="63e6b508-15b0-4732-9245-e19e2ce5ae5d" containerName="manager" containerID="cri-o://b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183" gracePeriod=10 Apr 18 02:57:17.940091 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:17.940066 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-874df8574-9pd27" Apr 18 02:57:18.060383 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:18.060354 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhnb4\" (UniqueName: \"kubernetes.io/projected/63e6b508-15b0-4732-9245-e19e2ce5ae5d-kube-api-access-dhnb4\") pod \"63e6b508-15b0-4732-9245-e19e2ce5ae5d\" (UID: \"63e6b508-15b0-4732-9245-e19e2ce5ae5d\") " Apr 18 02:57:18.062351 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:18.062326 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e6b508-15b0-4732-9245-e19e2ce5ae5d-kube-api-access-dhnb4" (OuterVolumeSpecName: "kube-api-access-dhnb4") pod "63e6b508-15b0-4732-9245-e19e2ce5ae5d" (UID: "63e6b508-15b0-4732-9245-e19e2ce5ae5d"). InnerVolumeSpecName "kube-api-access-dhnb4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:57:18.160910 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:18.160883 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dhnb4\" (UniqueName: \"kubernetes.io/projected/63e6b508-15b0-4732-9245-e19e2ce5ae5d-kube-api-access-dhnb4\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:57:18.693407 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:18.693366 2577 generic.go:358] "Generic (PLEG): container finished" podID="63e6b508-15b0-4732-9245-e19e2ce5ae5d" containerID="b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183" exitCode=0 Apr 18 02:57:18.693841 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:18.693429 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-874df8574-9pd27" Apr 18 02:57:18.693841 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:18.693439 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-874df8574-9pd27" event={"ID":"63e6b508-15b0-4732-9245-e19e2ce5ae5d","Type":"ContainerDied","Data":"b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183"} Apr 18 02:57:18.693841 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:18.693476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-874df8574-9pd27" event={"ID":"63e6b508-15b0-4732-9245-e19e2ce5ae5d","Type":"ContainerDied","Data":"f3d9296be8a0ad2f4f06d77f042f9a6509b87d0f20dc2762169e00f1cefb99cb"} Apr 18 02:57:18.693841 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:18.693493 2577 scope.go:117] "RemoveContainer" containerID="b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183" Apr 18 02:57:18.702920 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:18.702724 2577 scope.go:117] "RemoveContainer" containerID="b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183" Apr 18 02:57:18.703002 ip-10-0-128-79 kubenswrapper[2577]: E0418 
02:57:18.702978 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183\": container with ID starting with b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183 not found: ID does not exist" containerID="b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183" Apr 18 02:57:18.703046 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:18.703016 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183"} err="failed to get container status \"b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183\": rpc error: code = NotFound desc = could not find container \"b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183\": container with ID starting with b92eb8c5646a50b95c2b58148e0fd871cc68cf7ffa9df1619919cad820dcf183 not found: ID does not exist" Apr 18 02:57:18.716605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:18.716576 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-874df8574-9pd27"] Apr 18 02:57:18.718908 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:18.718889 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-874df8574-9pd27"] Apr 18 02:57:19.362363 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:19.362330 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63e6b508-15b0-4732-9245-e19e2ce5ae5d" path="/var/lib/kubelet/pods/63e6b508-15b0-4732-9245-e19e2ce5ae5d/volumes" Apr 18 02:57:37.181938 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.181900 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-f4b88597d-jw69s"] Apr 18 02:57:37.182337 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.182294 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="63e6b508-15b0-4732-9245-e19e2ce5ae5d" containerName="manager" Apr 18 02:57:37.182337 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.182306 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e6b508-15b0-4732-9245-e19e2ce5ae5d" containerName="manager" Apr 18 02:57:37.182408 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.182375 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="63e6b508-15b0-4732-9245-e19e2ce5ae5d" containerName="manager" Apr 18 02:57:37.186884 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.186864 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-f4b88597d-jw69s" Apr 18 02:57:37.191201 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.191174 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-f4b88597d-jw69s"] Apr 18 02:57:37.307942 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.307910 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lzhr\" (UniqueName: \"kubernetes.io/projected/d64cc1eb-aa61-49e6-9d51-529291d5c5e5-kube-api-access-9lzhr\") pod \"maas-api-f4b88597d-jw69s\" (UID: \"d64cc1eb-aa61-49e6-9d51-529291d5c5e5\") " pod="opendatahub/maas-api-f4b88597d-jw69s" Apr 18 02:57:37.307942 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.307944 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d64cc1eb-aa61-49e6-9d51-529291d5c5e5-maas-api-tls\") pod \"maas-api-f4b88597d-jw69s\" (UID: \"d64cc1eb-aa61-49e6-9d51-529291d5c5e5\") " pod="opendatahub/maas-api-f4b88597d-jw69s" Apr 18 02:57:37.409066 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.409029 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lzhr\" (UniqueName: 
\"kubernetes.io/projected/d64cc1eb-aa61-49e6-9d51-529291d5c5e5-kube-api-access-9lzhr\") pod \"maas-api-f4b88597d-jw69s\" (UID: \"d64cc1eb-aa61-49e6-9d51-529291d5c5e5\") " pod="opendatahub/maas-api-f4b88597d-jw69s" Apr 18 02:57:37.409066 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.409073 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d64cc1eb-aa61-49e6-9d51-529291d5c5e5-maas-api-tls\") pod \"maas-api-f4b88597d-jw69s\" (UID: \"d64cc1eb-aa61-49e6-9d51-529291d5c5e5\") " pod="opendatahub/maas-api-f4b88597d-jw69s" Apr 18 02:57:37.411338 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.411311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d64cc1eb-aa61-49e6-9d51-529291d5c5e5-maas-api-tls\") pod \"maas-api-f4b88597d-jw69s\" (UID: \"d64cc1eb-aa61-49e6-9d51-529291d5c5e5\") " pod="opendatahub/maas-api-f4b88597d-jw69s" Apr 18 02:57:37.416980 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.416955 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lzhr\" (UniqueName: \"kubernetes.io/projected/d64cc1eb-aa61-49e6-9d51-529291d5c5e5-kube-api-access-9lzhr\") pod \"maas-api-f4b88597d-jw69s\" (UID: \"d64cc1eb-aa61-49e6-9d51-529291d5c5e5\") " pod="opendatahub/maas-api-f4b88597d-jw69s" Apr 18 02:57:37.500096 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.500061 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-f4b88597d-jw69s" Apr 18 02:57:37.626989 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.626943 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-f4b88597d-jw69s"] Apr 18 02:57:37.778575 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:37.778475 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-f4b88597d-jw69s" event={"ID":"d64cc1eb-aa61-49e6-9d51-529291d5c5e5","Type":"ContainerStarted","Data":"1301a84ad8d62afd378a641b0bf375c4b2280d9a7281221fce1668fa17de8a1c"} Apr 18 02:57:39.171024 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.170993 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj"] Apr 18 02:57:39.174724 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.174702 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.177115 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.177091 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 18 02:57:39.178131 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.178099 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 18 02:57:39.178242 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.178203 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 18 02:57:39.178491 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.178471 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-62ckn\"" Apr 18 02:57:39.182507 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.182480 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj"] Apr 18 
02:57:39.327257 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.327186 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjwgv\" (UniqueName: \"kubernetes.io/projected/c72afa4c-f326-44c2-8451-e605fb9dbbda-kube-api-access-zjwgv\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.327257 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.327223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c72afa4c-f326-44c2-8451-e605fb9dbbda-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.327435 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.327258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c72afa4c-f326-44c2-8451-e605fb9dbbda-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.327435 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.327279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c72afa4c-f326-44c2-8451-e605fb9dbbda-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.327435 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.327346 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c72afa4c-f326-44c2-8451-e605fb9dbbda-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.327435 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.327375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c72afa4c-f326-44c2-8451-e605fb9dbbda-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.428633 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.428603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c72afa4c-f326-44c2-8451-e605fb9dbbda-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.428789 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.428660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjwgv\" (UniqueName: \"kubernetes.io/projected/c72afa4c-f326-44c2-8451-e605fb9dbbda-kube-api-access-zjwgv\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.428789 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.428701 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c72afa4c-f326-44c2-8451-e605fb9dbbda-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.428789 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.428752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c72afa4c-f326-44c2-8451-e605fb9dbbda-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.428948 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.428786 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c72afa4c-f326-44c2-8451-e605fb9dbbda-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.429020 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.428994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c72afa4c-f326-44c2-8451-e605fb9dbbda-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.429127 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.429112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c72afa4c-f326-44c2-8451-e605fb9dbbda-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.429190 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.429164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c72afa4c-f326-44c2-8451-e605fb9dbbda-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.429237 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.429185 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c72afa4c-f326-44c2-8451-e605fb9dbbda-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.431538 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.431514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c72afa4c-f326-44c2-8451-e605fb9dbbda-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.436796 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.436769 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c72afa4c-f326-44c2-8451-e605fb9dbbda-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.437716 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.437691 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjwgv\" (UniqueName: \"kubernetes.io/projected/c72afa4c-f326-44c2-8451-e605fb9dbbda-kube-api-access-zjwgv\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-mzxhj\" (UID: \"c72afa4c-f326-44c2-8451-e605fb9dbbda\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.485606 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:57:39.485546 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:57:39.638143 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.638069 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj"] Apr 18 02:57:39.641642 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:57:39.641611 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc72afa4c_f326_44c2_8451_e605fb9dbbda.slice/crio-483dc3d4cd22ee6584adac22b08b389f3a68a7acdd8b23125cf4b28cc4964bc2 WatchSource:0}: Error finding container 483dc3d4cd22ee6584adac22b08b389f3a68a7acdd8b23125cf4b28cc4964bc2: Status 404 returned error can't find the container with id 483dc3d4cd22ee6584adac22b08b389f3a68a7acdd8b23125cf4b28cc4964bc2 Apr 18 02:57:39.788584 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.788544 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-f4b88597d-jw69s" event={"ID":"d64cc1eb-aa61-49e6-9d51-529291d5c5e5","Type":"ContainerStarted","Data":"ef91aa8e550f6db0bafe8110b5f647a38ee3529ca10663cfa923d9484b745fa2"} Apr 18 02:57:39.788733 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.788681 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-f4b88597d-jw69s" Apr 18 02:57:39.789592 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.789540 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" event={"ID":"c72afa4c-f326-44c2-8451-e605fb9dbbda","Type":"ContainerStarted","Data":"483dc3d4cd22ee6584adac22b08b389f3a68a7acdd8b23125cf4b28cc4964bc2"} Apr 18 02:57:39.804706 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:39.804663 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/maas-api-f4b88597d-jw69s" podStartSLOduration=1.426839629 podStartE2EDuration="2.80464908s" podCreationTimestamp="2026-04-18 02:57:37 +0000 UTC" firstStartedPulling="2026-04-18 02:57:37.639004585 +0000 UTC m=+700.778855907" lastFinishedPulling="2026-04-18 02:57:39.016814042 +0000 UTC m=+702.156665358" observedRunningTime="2026-04-18 02:57:39.802815481 +0000 UTC m=+702.942666818" watchObservedRunningTime="2026-04-18 02:57:39.80464908 +0000 UTC m=+702.944500416" Apr 18 02:57:45.799605 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:45.799518 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-f4b88597d-jw69s" Apr 18 02:57:45.843079 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:45.843050 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-965d75d4b-gxsqw"] Apr 18 02:57:45.843361 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:45.843338 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-965d75d4b-gxsqw" podUID="64567602-f663-487c-8beb-bb884e3abfde" containerName="maas-api" containerID="cri-o://d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4" gracePeriod=30 Apr 18 02:57:46.095300 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.095278 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-965d75d4b-gxsqw" Apr 18 02:57:46.203246 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.203212 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxgpx\" (UniqueName: \"kubernetes.io/projected/64567602-f663-487c-8beb-bb884e3abfde-kube-api-access-cxgpx\") pod \"64567602-f663-487c-8beb-bb884e3abfde\" (UID: \"64567602-f663-487c-8beb-bb884e3abfde\") " Apr 18 02:57:46.203450 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.203281 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/64567602-f663-487c-8beb-bb884e3abfde-maas-api-tls\") pod \"64567602-f663-487c-8beb-bb884e3abfde\" (UID: \"64567602-f663-487c-8beb-bb884e3abfde\") " Apr 18 02:57:46.205267 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.205231 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64567602-f663-487c-8beb-bb884e3abfde-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "64567602-f663-487c-8beb-bb884e3abfde" (UID: "64567602-f663-487c-8beb-bb884e3abfde"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:57:46.205360 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.205281 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64567602-f663-487c-8beb-bb884e3abfde-kube-api-access-cxgpx" (OuterVolumeSpecName: "kube-api-access-cxgpx") pod "64567602-f663-487c-8beb-bb884e3abfde" (UID: "64567602-f663-487c-8beb-bb884e3abfde"). InnerVolumeSpecName "kube-api-access-cxgpx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:57:46.304681 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.304644 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cxgpx\" (UniqueName: \"kubernetes.io/projected/64567602-f663-487c-8beb-bb884e3abfde-kube-api-access-cxgpx\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:57:46.304681 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.304677 2577 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/64567602-f663-487c-8beb-bb884e3abfde-maas-api-tls\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 02:57:46.822269 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.822206 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" event={"ID":"c72afa4c-f326-44c2-8451-e605fb9dbbda","Type":"ContainerStarted","Data":"9648b75046cbb6ee44c4c1408be98d988c94dbc2d18e3a6ea322cbbb4d8eba9d"} Apr 18 02:57:46.823632 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.823604 2577 generic.go:358] "Generic (PLEG): container finished" podID="64567602-f663-487c-8beb-bb884e3abfde" containerID="d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4" exitCode=0 Apr 18 02:57:46.823748 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.823671 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-965d75d4b-gxsqw" Apr 18 02:57:46.823748 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.823684 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-965d75d4b-gxsqw" event={"ID":"64567602-f663-487c-8beb-bb884e3abfde","Type":"ContainerDied","Data":"d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4"} Apr 18 02:57:46.823748 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.823718 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-965d75d4b-gxsqw" event={"ID":"64567602-f663-487c-8beb-bb884e3abfde","Type":"ContainerDied","Data":"e4a3420d3a1d5a2c9279bd429fb2924b806112237c9e643889d17e4019871588"} Apr 18 02:57:46.823748 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.823739 2577 scope.go:117] "RemoveContainer" containerID="d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4" Apr 18 02:57:46.834341 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.834322 2577 scope.go:117] "RemoveContainer" containerID="d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4" Apr 18 02:57:46.834604 ip-10-0-128-79 kubenswrapper[2577]: E0418 02:57:46.834579 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4\": container with ID starting with d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4 not found: ID does not exist" containerID="d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4" Apr 18 02:57:46.834689 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.834616 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4"} err="failed to get container status \"d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4\": rpc error: code = NotFound desc = could not 
find container \"d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4\": container with ID starting with d544bfaa032bd472d4bea0ddefc85f1e1105cef12728a84c35c638391ab834a4 not found: ID does not exist" Apr 18 02:57:46.854336 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.854314 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-965d75d4b-gxsqw"] Apr 18 02:57:46.856519 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:46.856499 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-965d75d4b-gxsqw"] Apr 18 02:57:47.363888 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:47.363858 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64567602-f663-487c-8beb-bb884e3abfde" path="/var/lib/kubelet/pods/64567602-f663-487c-8beb-bb884e3abfde/volumes" Apr 18 02:57:51.768305 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.768274 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc"] Apr 18 02:57:51.768678 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.768657 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64567602-f663-487c-8beb-bb884e3abfde" containerName="maas-api" Apr 18 02:57:51.768678 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.768670 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="64567602-f663-487c-8beb-bb884e3abfde" containerName="maas-api" Apr 18 02:57:51.768761 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.768733 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="64567602-f663-487c-8beb-bb884e3abfde" containerName="maas-api" Apr 18 02:57:51.771855 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.771837 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:51.774535 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.774506 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 18 02:57:51.789378 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.789355 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc"] Apr 18 02:57:51.848213 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.848187 2577 generic.go:358] "Generic (PLEG): container finished" podID="c72afa4c-f326-44c2-8451-e605fb9dbbda" containerID="9648b75046cbb6ee44c4c1408be98d988c94dbc2d18e3a6ea322cbbb4d8eba9d" exitCode=0 Apr 18 02:57:51.848330 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.848263 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" event={"ID":"c72afa4c-f326-44c2-8451-e605fb9dbbda","Type":"ContainerDied","Data":"9648b75046cbb6ee44c4c1408be98d988c94dbc2d18e3a6ea322cbbb4d8eba9d"} Apr 18 02:57:51.958931 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.958896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/220fdfa4-c89d-450a-8399-66a73c0599b5-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:51.959060 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.958935 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/220fdfa4-c89d-450a-8399-66a73c0599b5-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" 
Apr 18 02:57:51.959060 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.958956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/220fdfa4-c89d-450a-8399-66a73c0599b5-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:51.959156 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.959047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqdnc\" (UniqueName: \"kubernetes.io/projected/220fdfa4-c89d-450a-8399-66a73c0599b5-kube-api-access-sqdnc\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:51.959198 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.959180 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/220fdfa4-c89d-450a-8399-66a73c0599b5-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:51.959255 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:51.959239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/220fdfa4-c89d-450a-8399-66a73c0599b5-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.060473 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.060392 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/220fdfa4-c89d-450a-8399-66a73c0599b5-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.060473 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.060450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/220fdfa4-c89d-450a-8399-66a73c0599b5-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.060723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.060482 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/220fdfa4-c89d-450a-8399-66a73c0599b5-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.060723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.060512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/220fdfa4-c89d-450a-8399-66a73c0599b5-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.060723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.060585 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqdnc\" (UniqueName: \"kubernetes.io/projected/220fdfa4-c89d-450a-8399-66a73c0599b5-kube-api-access-sqdnc\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.060723 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.060681 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/220fdfa4-c89d-450a-8399-66a73c0599b5-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.061169 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.061117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/220fdfa4-c89d-450a-8399-66a73c0599b5-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.061290 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.061209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/220fdfa4-c89d-450a-8399-66a73c0599b5-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.061290 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.061233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/220fdfa4-c89d-450a-8399-66a73c0599b5-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.063913 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.063794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/220fdfa4-c89d-450a-8399-66a73c0599b5-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.064034 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.064014 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/220fdfa4-c89d-450a-8399-66a73c0599b5-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.068887 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.068779 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqdnc\" (UniqueName: \"kubernetes.io/projected/220fdfa4-c89d-450a-8399-66a73c0599b5-kube-api-access-sqdnc\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc\" (UID: \"220fdfa4-c89d-450a-8399-66a73c0599b5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.082593 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.082521 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:52.256349 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.256309 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc"] Apr 18 02:57:52.258372 ip-10-0-128-79 kubenswrapper[2577]: W0418 02:57:52.258339 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod220fdfa4_c89d_450a_8399_66a73c0599b5.slice/crio-05c4b566c7938467a69a6f46c5ebb4d638b34369ee393fab2f0e1b01d0c1cba5 WatchSource:0}: Error finding container 05c4b566c7938467a69a6f46c5ebb4d638b34369ee393fab2f0e1b01d0c1cba5: Status 404 returned error can't find the container with id 05c4b566c7938467a69a6f46c5ebb4d638b34369ee393fab2f0e1b01d0c1cba5 Apr 18 02:57:52.854833 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.854743 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" event={"ID":"220fdfa4-c89d-450a-8399-66a73c0599b5","Type":"ContainerStarted","Data":"55f769112ab54e8b5521df1e3f899ac3e4d72a50e79cdc1b365f5cce11939212"} Apr 18 02:57:52.854833 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:52.854785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" event={"ID":"220fdfa4-c89d-450a-8399-66a73c0599b5","Type":"ContainerStarted","Data":"05c4b566c7938467a69a6f46c5ebb4d638b34369ee393fab2f0e1b01d0c1cba5"} Apr 18 02:57:53.860255 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:53.860220 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" event={"ID":"c72afa4c-f326-44c2-8451-e605fb9dbbda","Type":"ContainerStarted","Data":"30d171f440442979f5657c33f7162e3884f22a061c262bf7c74682b19874e666"} Apr 18 02:57:53.878383 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:53.878337 2577 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" podStartSLOduration=1.374689288 podStartE2EDuration="14.878322718s" podCreationTimestamp="2026-04-18 02:57:39 +0000 UTC" firstStartedPulling="2026-04-18 02:57:39.643309241 +0000 UTC m=+702.783160554" lastFinishedPulling="2026-04-18 02:57:53.14694266 +0000 UTC m=+716.286793984" observedRunningTime="2026-04-18 02:57:53.87613099 +0000 UTC m=+717.015982336" watchObservedRunningTime="2026-04-18 02:57:53.878322718 +0000 UTC m=+717.018174053" Apr 18 02:57:58.881800 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:58.881765 2577 generic.go:358] "Generic (PLEG): container finished" podID="220fdfa4-c89d-450a-8399-66a73c0599b5" containerID="55f769112ab54e8b5521df1e3f899ac3e4d72a50e79cdc1b365f5cce11939212" exitCode=0 Apr 18 02:57:58.882167 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:58.881839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" event={"ID":"220fdfa4-c89d-450a-8399-66a73c0599b5","Type":"ContainerDied","Data":"55f769112ab54e8b5521df1e3f899ac3e4d72a50e79cdc1b365f5cce11939212"} Apr 18 02:57:59.887333 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:59.887298 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" event={"ID":"220fdfa4-c89d-450a-8399-66a73c0599b5","Type":"ContainerStarted","Data":"052320b9aa7f0f930c17252146661e7de982775d2a563e05158daa52a790887c"} Apr 18 02:57:59.887742 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:59.887541 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:57:59.905882 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:57:59.905835 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" podStartSLOduration=8.729141318 podStartE2EDuration="8.90582252s" 
podCreationTimestamp="2026-04-18 02:57:51 +0000 UTC" firstStartedPulling="2026-04-18 02:57:58.882454669 +0000 UTC m=+722.022305982" lastFinishedPulling="2026-04-18 02:57:59.059135868 +0000 UTC m=+722.198987184" observedRunningTime="2026-04-18 02:57:59.902417873 +0000 UTC m=+723.042269208" watchObservedRunningTime="2026-04-18 02:57:59.90582252 +0000 UTC m=+723.045673855" Apr 18 02:58:03.860503 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:03.860460 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:58:03.873288 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:03.873264 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-mzxhj" Apr 18 02:58:10.905203 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:10.905166 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc" Apr 18 02:58:15.873591 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:15.873538 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl"] Apr 18 02:58:15.905531 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:15.905504 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl"] Apr 18 02:58:15.905685 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:15.905633 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:15.908042 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:15.908017 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 18 02:58:15.966711 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:15.966678 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eca800fe-1b80-453a-9322-748b5d7f9e3d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:15.966711 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:15.966714 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zvsv\" (UniqueName: \"kubernetes.io/projected/eca800fe-1b80-453a-9322-748b5d7f9e3d-kube-api-access-6zvsv\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:15.966896 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:15.966798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eca800fe-1b80-453a-9322-748b5d7f9e3d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:15.966896 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:15.966824 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eca800fe-1b80-453a-9322-748b5d7f9e3d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:15.966896 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:15.966881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eca800fe-1b80-453a-9322-748b5d7f9e3d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:15.966996 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:15.966932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eca800fe-1b80-453a-9322-748b5d7f9e3d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.067886 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.067852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eca800fe-1b80-453a-9322-748b5d7f9e3d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.067886 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.067889 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eca800fe-1b80-453a-9322-748b5d7f9e3d-home\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.068103 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.067932 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eca800fe-1b80-453a-9322-748b5d7f9e3d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.068103 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.067963 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eca800fe-1b80-453a-9322-748b5d7f9e3d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.068103 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.068000 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eca800fe-1b80-453a-9322-748b5d7f9e3d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.068103 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.068026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zvsv\" (UniqueName: \"kubernetes.io/projected/eca800fe-1b80-453a-9322-748b5d7f9e3d-kube-api-access-6zvsv\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: 
\"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.068455 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.068429 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eca800fe-1b80-453a-9322-748b5d7f9e3d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.068614 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.068433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eca800fe-1b80-453a-9322-748b5d7f9e3d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.068614 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.068500 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eca800fe-1b80-453a-9322-748b5d7f9e3d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.070384 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.070362 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eca800fe-1b80-453a-9322-748b5d7f9e3d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.070501 ip-10-0-128-79 
kubenswrapper[2577]: I0418 02:58:16.070464 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eca800fe-1b80-453a-9322-748b5d7f9e3d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.075291 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.075267 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zvsv\" (UniqueName: \"kubernetes.io/projected/eca800fe-1b80-453a-9322-748b5d7f9e3d-kube-api-access-6zvsv\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl\" (UID: \"eca800fe-1b80-453a-9322-748b5d7f9e3d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.216161 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.216082 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:16.346571 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.346368 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl"] Apr 18 02:58:16.960494 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.960461 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" event={"ID":"eca800fe-1b80-453a-9322-748b5d7f9e3d","Type":"ContainerStarted","Data":"7abfd03d0498e860865fad8716be82c3f325a9d1afcad6a890e6cd16ad8142e3"} Apr 18 02:58:16.960494 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:16.960495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" event={"ID":"eca800fe-1b80-453a-9322-748b5d7f9e3d","Type":"ContainerStarted","Data":"2b5e2913ad42dc3ac007fd79956898720ae58606ec0eef71bb294f87e80feb34"} Apr 18 02:58:21.980639 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:21.980599 2577 generic.go:358] "Generic (PLEG): container finished" podID="eca800fe-1b80-453a-9322-748b5d7f9e3d" containerID="7abfd03d0498e860865fad8716be82c3f325a9d1afcad6a890e6cd16ad8142e3" exitCode=0 Apr 18 02:58:21.981015 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:21.980675 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" event={"ID":"eca800fe-1b80-453a-9322-748b5d7f9e3d","Type":"ContainerDied","Data":"7abfd03d0498e860865fad8716be82c3f325a9d1afcad6a890e6cd16ad8142e3"} Apr 18 02:58:22.986655 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:22.986619 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" 
event={"ID":"eca800fe-1b80-453a-9322-748b5d7f9e3d","Type":"ContainerStarted","Data":"352b1a6201773bd3085b9b2962bdf710e94b536014fb93b65ba1805678e0b976"} Apr 18 02:58:22.987097 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:22.986845 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 02:58:23.004652 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:23.004606 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" podStartSLOduration=7.839114786 podStartE2EDuration="8.004592782s" podCreationTimestamp="2026-04-18 02:58:15 +0000 UTC" firstStartedPulling="2026-04-18 02:58:21.981308872 +0000 UTC m=+745.121160185" lastFinishedPulling="2026-04-18 02:58:22.146786866 +0000 UTC m=+745.286638181" observedRunningTime="2026-04-18 02:58:23.002703737 +0000 UTC m=+746.142555071" watchObservedRunningTime="2026-04-18 02:58:23.004592782 +0000 UTC m=+746.144444117" Apr 18 02:58:34.004686 ip-10-0-128-79 kubenswrapper[2577]: I0418 02:58:34.004656 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl" Apr 18 03:00:32.278206 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.278175 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7dbc9c957b-7447h"] Apr 18 03:00:32.278773 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.278402 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7dbc9c957b-7447h" podUID="9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b" containerName="manager" containerID="cri-o://06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf" gracePeriod=10 Apr 18 03:00:32.518752 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.518732 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7dbc9c957b-7447h" Apr 18 03:00:32.526067 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.526038 2577 generic.go:358] "Generic (PLEG): container finished" podID="9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b" containerID="06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf" exitCode=0 Apr 18 03:00:32.526175 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.526096 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7dbc9c957b-7447h" Apr 18 03:00:32.526175 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.526115 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7dbc9c957b-7447h" event={"ID":"9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b","Type":"ContainerDied","Data":"06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf"} Apr 18 03:00:32.526175 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.526151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7dbc9c957b-7447h" event={"ID":"9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b","Type":"ContainerDied","Data":"3d85ca8d2031fc3856178b38ceb38c2eb63aab1a2a22d42066403627dffde870"} Apr 18 03:00:32.526175 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.526168 2577 scope.go:117] "RemoveContainer" containerID="06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf" Apr 18 03:00:32.535305 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.535270 2577 scope.go:117] "RemoveContainer" containerID="06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf" Apr 18 03:00:32.535595 ip-10-0-128-79 kubenswrapper[2577]: E0418 03:00:32.535569 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf\": container with ID starting with 
06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf not found: ID does not exist" containerID="06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf" Apr 18 03:00:32.535715 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.535600 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf"} err="failed to get container status \"06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf\": rpc error: code = NotFound desc = could not find container \"06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf\": container with ID starting with 06f72c1ac6a69bc0b574ae0756652d750350352713510134b00ab241b6d4ccaf not found: ID does not exist" Apr 18 03:00:32.600041 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.600016 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkrdh\" (UniqueName: \"kubernetes.io/projected/9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b-kube-api-access-pkrdh\") pod \"9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b\" (UID: \"9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b\") " Apr 18 03:00:32.602006 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.601974 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b-kube-api-access-pkrdh" (OuterVolumeSpecName: "kube-api-access-pkrdh") pod "9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b" (UID: "9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b"). InnerVolumeSpecName "kube-api-access-pkrdh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 03:00:32.700939 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.700906 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pkrdh\" (UniqueName: \"kubernetes.io/projected/9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b-kube-api-access-pkrdh\") on node \"ip-10-0-128-79.ec2.internal\" DevicePath \"\"" Apr 18 03:00:32.848296 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.848263 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7dbc9c957b-7447h"] Apr 18 03:00:32.852381 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:32.852349 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7dbc9c957b-7447h"] Apr 18 03:00:33.362987 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:33.362953 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b" path="/var/lib/kubelet/pods/9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b/volumes" Apr 18 03:00:33.567407 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:33.567374 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7dbc9c957b-9mzz9"] Apr 18 03:00:33.567789 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:33.567776 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b" containerName="manager" Apr 18 03:00:33.567843 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:33.567793 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b" containerName="manager" Apr 18 03:00:33.567891 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:33.567881 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e0c4bc6-d1eb-47fa-92b1-630f7c0ce58b" containerName="manager" Apr 18 03:00:33.572436 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:33.572417 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7dbc9c957b-9mzz9" Apr 18 03:00:33.575713 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:33.575689 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-6qtxj\"" Apr 18 03:00:33.579074 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:33.579050 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7dbc9c957b-9mzz9"] Apr 18 03:00:33.711753 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:33.711728 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cwwl\" (UniqueName: \"kubernetes.io/projected/cbf2ed7e-d4b9-4e1a-8f9a-ab65766dc2c8-kube-api-access-5cwwl\") pod \"maas-controller-7dbc9c957b-9mzz9\" (UID: \"cbf2ed7e-d4b9-4e1a-8f9a-ab65766dc2c8\") " pod="opendatahub/maas-controller-7dbc9c957b-9mzz9" Apr 18 03:00:33.812512 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:33.812480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cwwl\" (UniqueName: \"kubernetes.io/projected/cbf2ed7e-d4b9-4e1a-8f9a-ab65766dc2c8-kube-api-access-5cwwl\") pod \"maas-controller-7dbc9c957b-9mzz9\" (UID: \"cbf2ed7e-d4b9-4e1a-8f9a-ab65766dc2c8\") " pod="opendatahub/maas-controller-7dbc9c957b-9mzz9" Apr 18 03:00:33.820098 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:33.820071 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cwwl\" (UniqueName: \"kubernetes.io/projected/cbf2ed7e-d4b9-4e1a-8f9a-ab65766dc2c8-kube-api-access-5cwwl\") pod \"maas-controller-7dbc9c957b-9mzz9\" (UID: \"cbf2ed7e-d4b9-4e1a-8f9a-ab65766dc2c8\") " pod="opendatahub/maas-controller-7dbc9c957b-9mzz9" Apr 18 03:00:33.883858 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:33.883835 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7dbc9c957b-9mzz9" Apr 18 03:00:34.005204 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:34.005179 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7dbc9c957b-9mzz9"] Apr 18 03:00:34.006621 ip-10-0-128-79 kubenswrapper[2577]: W0418 03:00:34.006592 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbf2ed7e_d4b9_4e1a_8f9a_ab65766dc2c8.slice/crio-0957af1a1ce8aa173482ded6bc7bf733a6f412dc04846e51c940fe6d3be57d83 WatchSource:0}: Error finding container 0957af1a1ce8aa173482ded6bc7bf733a6f412dc04846e51c940fe6d3be57d83: Status 404 returned error can't find the container with id 0957af1a1ce8aa173482ded6bc7bf733a6f412dc04846e51c940fe6d3be57d83 Apr 18 03:00:34.542170 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:34.542127 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7dbc9c957b-9mzz9" event={"ID":"cbf2ed7e-d4b9-4e1a-8f9a-ab65766dc2c8","Type":"ContainerStarted","Data":"e11eb3fae7fe9c3f3574ce2155765fa5026390664280d744a34989ac4a4fa047"} Apr 18 03:00:34.542170 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:34.542163 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7dbc9c957b-9mzz9" event={"ID":"cbf2ed7e-d4b9-4e1a-8f9a-ab65766dc2c8","Type":"ContainerStarted","Data":"0957af1a1ce8aa173482ded6bc7bf733a6f412dc04846e51c940fe6d3be57d83"} Apr 18 03:00:34.542604 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:34.542430 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7dbc9c957b-9mzz9" Apr 18 03:00:34.557883 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:34.557835 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7dbc9c957b-9mzz9" podStartSLOduration=1.198090689 podStartE2EDuration="1.557823468s" 
podCreationTimestamp="2026-04-18 03:00:33 +0000 UTC" firstStartedPulling="2026-04-18 03:00:34.007947372 +0000 UTC m=+877.147798685" lastFinishedPulling="2026-04-18 03:00:34.367680152 +0000 UTC m=+877.507531464" observedRunningTime="2026-04-18 03:00:34.555699151 +0000 UTC m=+877.695550486" watchObservedRunningTime="2026-04-18 03:00:34.557823468 +0000 UTC m=+877.697674803" Apr 18 03:00:45.551515 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:45.551445 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7dbc9c957b-9mzz9" Apr 18 03:00:57.332636 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:57.332610 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 03:00:57.334722 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:57.334699 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 03:00:57.338389 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:57.337792 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 03:00:57.340780 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:00:57.340762 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 03:05:57.385307 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:05:57.385276 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 03:05:57.385892 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:05:57.385365 2577 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 03:05:57.389654 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:05:57.389634 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 03:05:57.389778 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:05:57.389703 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 03:10:57.418021 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:10:57.417993 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 03:10:57.420316 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:10:57.420292 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 03:10:57.421452 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:10:57.421435 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 03:10:57.423982 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:10:57.423962 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 03:15:57.452489 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:15:57.452445 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 03:15:57.456564 ip-10-0-128-79 kubenswrapper[2577]: 
I0418 03:15:57.456525 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 03:15:57.458335 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:15:57.458311 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 03:15:57.461759 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:15:57.461741 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 03:20:57.488967 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:20:57.488939 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 03:20:57.493100 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:20:57.493076 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 03:20:57.496491 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:20:57.496471 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 03:20:57.499805 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:20:57.499787 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 03:21:26.961041 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:26.961010 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-82x78_b1c32584-1033-4e0f-a731-86a24df69910/manager/0.log" Apr 18 
03:21:27.081223 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:27.081193 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-f4b88597d-jw69s_d64cc1eb-aa61-49e6-9d51-529291d5c5e5/maas-api/0.log" Apr 18 03:21:27.197821 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:27.197798 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7dbc9c957b-9mzz9_cbf2ed7e-d4b9-4e1a-8f9a-ab65766dc2c8/manager/0.log" Apr 18 03:21:27.313717 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:27.313687 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-djdc9_0f7c1882-395f-4300-aeaa-4c83728c6e2e/manager/2.log" Apr 18 03:21:27.686067 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:27.685958 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-b6bf46549-x5vn4_f8222645-c589-4075-9ff6-ddaaaa73ed9f/manager/0.log" Apr 18 03:21:28.545916 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:28.545886 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r_889e9332-e348-4ec0-8b9e-c4160a045d91/extract/0.log" Apr 18 03:21:28.551719 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:28.551696 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r_889e9332-e348-4ec0-8b9e-c4160a045d91/util/0.log" Apr 18 03:21:28.557688 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:28.557669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r_889e9332-e348-4ec0-8b9e-c4160a045d91/pull/0.log" Apr 18 03:21:28.671034 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:28.671008 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg_2527da1f-3774-46d7-8048-e176c1e9e774/util/0.log" Apr 18 03:21:28.677087 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:28.677047 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg_2527da1f-3774-46d7-8048-e176c1e9e774/pull/0.log" Apr 18 03:21:28.683108 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:28.683091 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg_2527da1f-3774-46d7-8048-e176c1e9e774/extract/0.log" Apr 18 03:21:28.788274 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:28.788249 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb_225bddc0-d5f4-4ba2-8605-d8b5a5825f9a/util/0.log" Apr 18 03:21:28.794329 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:28.794311 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb_225bddc0-d5f4-4ba2-8605-d8b5a5825f9a/pull/0.log" Apr 18 03:21:28.800389 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:28.800327 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb_225bddc0-d5f4-4ba2-8605-d8b5a5825f9a/extract/0.log" Apr 18 03:21:28.910763 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:28.910741 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z_ee130515-6cbc-4c3b-aa7b-b4df838191b7/util/0.log" Apr 18 03:21:28.922425 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:28.922404 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z_ee130515-6cbc-4c3b-aa7b-b4df838191b7/pull/0.log" Apr 18 03:21:28.928297 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:28.928280 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z_ee130515-6cbc-4c3b-aa7b-b4df838191b7/extract/0.log" Apr 18 03:21:29.169163 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:29.169102 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-dbbvs_16b1cf91-83fa-4379-93a6-1e808fa66a29/manager/0.log" Apr 18 03:21:29.276592 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:29.276574 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-b86bj_4abf3863-834d-4961-bcd8-40633bf2747f/manager/0.log" Apr 18 03:21:30.198107 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:30.198078 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fm44jh_1062f7de-5f7a-4797-a63a-e6799379b8fc/istio-proxy/0.log" Apr 18 03:21:30.308492 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:30.308466 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-mnlc5_2f2f0e0a-1564-4e11-9483-0870a4b1f8f2/discovery/0.log" Apr 18 03:21:30.413707 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:30.413686 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-79f769b49f-b22gm_9456592c-61cf-4a59-820e-7c061a014993/kube-auth-proxy/0.log" Apr 18 03:21:30.642109 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:30.642070 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-nchxm_e3ce470d-8790-41a4-9bbe-2a771cb5191c/istio-proxy/0.log" Apr 18 03:21:30.754637 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:30.754610 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-87df6b698-c7764_804c2955-592f-4663-b301-f7f6e6d14909/router/0.log" Apr 18 03:21:31.089512 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:31.089484 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc_220fdfa4-c89d-450a-8399-66a73c0599b5/storage-initializer/0.log" Apr 18 03:21:31.096578 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:31.096534 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-b8hxc_220fdfa4-c89d-450a-8399-66a73c0599b5/main/0.log" Apr 18 03:21:31.327837 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:31.327812 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-mzxhj_c72afa4c-f326-44c2-8451-e605fb9dbbda/storage-initializer/0.log" Apr 18 03:21:31.335323 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:31.335306 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-mzxhj_c72afa4c-f326-44c2-8451-e605fb9dbbda/main/0.log" Apr 18 03:21:31.448480 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:31.448418 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl_eca800fe-1b80-453a-9322-748b5d7f9e3d/storage-initializer/0.log" Apr 18 03:21:31.458999 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:31.458974 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc46xdl_eca800fe-1b80-453a-9322-748b5d7f9e3d/main/0.log" Apr 18 03:21:38.283022 ip-10-0-128-79 
kubenswrapper[2577]: I0418 03:21:38.282992 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7fmtp_b37c426b-d777-4a11-9630-cc3f589672b0/global-pull-secret-syncer/0.log" Apr 18 03:21:38.388321 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:38.388291 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4x4ql_1cbb8019-14fc-48b5-b072-319e2f45207e/konnectivity-agent/0.log" Apr 18 03:21:38.552700 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:38.552629 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-79.ec2.internal_5f163ec50d423dcd51089184ab62a7d6/haproxy/0.log" Apr 18 03:21:42.477876 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:42.477845 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r_889e9332-e348-4ec0-8b9e-c4160a045d91/extract/0.log" Apr 18 03:21:42.499152 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:42.499126 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r_889e9332-e348-4ec0-8b9e-c4160a045d91/util/0.log" Apr 18 03:21:42.519173 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:42.519156 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592t45r_889e9332-e348-4ec0-8b9e-c4160a045d91/pull/0.log" Apr 18 03:21:42.546801 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:42.546783 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg_2527da1f-3774-46d7-8048-e176c1e9e774/extract/0.log" Apr 18 03:21:42.570944 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:42.570924 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg_2527da1f-3774-46d7-8048-e176c1e9e774/util/0.log" Apr 18 03:21:42.590838 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:42.590818 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e069xdg_2527da1f-3774-46d7-8048-e176c1e9e774/pull/0.log" Apr 18 03:21:42.619427 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:42.619394 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb_225bddc0-d5f4-4ba2-8605-d8b5a5825f9a/extract/0.log" Apr 18 03:21:42.641224 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:42.641199 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb_225bddc0-d5f4-4ba2-8605-d8b5a5825f9a/util/0.log" Apr 18 03:21:42.665205 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:42.665189 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73p9tlb_225bddc0-d5f4-4ba2-8605-d8b5a5825f9a/pull/0.log" Apr 18 03:21:42.692990 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:42.692971 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z_ee130515-6cbc-4c3b-aa7b-b4df838191b7/extract/0.log" Apr 18 03:21:42.712729 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:42.712714 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z_ee130515-6cbc-4c3b-aa7b-b4df838191b7/util/0.log" Apr 18 03:21:42.732604 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:42.732576 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14xs8z_ee130515-6cbc-4c3b-aa7b-b4df838191b7/pull/0.log" Apr 18 03:21:43.105012 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:43.104940 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-dbbvs_16b1cf91-83fa-4379-93a6-1e808fa66a29/manager/0.log" Apr 18 03:21:43.129572 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:43.129445 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-b86bj_4abf3863-834d-4961-bcd8-40633bf2747f/manager/0.log" Apr 18 03:21:45.001633 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:45.001593 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-98wv9_d50fae41-d0e5-4ab2-8306-73f4036be5d1/cluster-monitoring-operator/0.log" Apr 18 03:21:45.105799 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:45.105773 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-757669dd5c-9qwfl_6ceb01e6-2764-48f1-8ea7-cfec3b7e935b/metrics-server/0.log" Apr 18 03:21:45.161287 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:45.161265 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxw6w_9dc12d5a-da8e-40a0-8c4f-d989f722e15e/node-exporter/0.log" Apr 18 03:21:45.181354 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:45.181334 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxw6w_9dc12d5a-da8e-40a0-8c4f-d989f722e15e/kube-rbac-proxy/0.log" Apr 18 03:21:45.205537 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:45.205521 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxw6w_9dc12d5a-da8e-40a0-8c4f-d989f722e15e/init-textfile/0.log" Apr 18 03:21:45.653676 ip-10-0-128-79 kubenswrapper[2577]: I0418 
03:21:45.653646 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-crfwl_0ed4e54a-7df6-451d-81f0-1d54c83a76f1/prometheus-operator/0.log" Apr 18 03:21:45.677711 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:45.677685 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-crfwl_0ed4e54a-7df6-451d-81f0-1d54c83a76f1/kube-rbac-proxy/0.log" Apr 18 03:21:45.698196 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:45.698173 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-f89tq_c5d30379-95e3-4c98-b645-bda215d9fed8/prometheus-operator-admission-webhook/0.log" Apr 18 03:21:46.533915 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.533879 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8bclg/perf-node-gather-daemonset-276px"] Apr 18 03:21:46.537696 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.537676 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.540136 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.540116 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8bclg\"/\"kube-root-ca.crt\"" Apr 18 03:21:46.541228 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.541204 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8bclg\"/\"default-dockercfg-mpz9c\"" Apr 18 03:21:46.541337 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.541250 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8bclg\"/\"openshift-service-ca.crt\"" Apr 18 03:21:46.546614 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.546593 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8bclg/perf-node-gather-daemonset-276px"] Apr 18 03:21:46.709848 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.709813 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nssl\" (UniqueName: \"kubernetes.io/projected/473cc886-cb7a-49ef-adcf-4990dd0b604b-kube-api-access-4nssl\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.710018 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.709872 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/473cc886-cb7a-49ef-adcf-4990dd0b604b-proc\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.710018 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.709962 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/473cc886-cb7a-49ef-adcf-4990dd0b604b-lib-modules\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.710018 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.709986 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/473cc886-cb7a-49ef-adcf-4990dd0b604b-podres\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.710142 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.710049 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/473cc886-cb7a-49ef-adcf-4990dd0b604b-sys\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.811444 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.811356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nssl\" (UniqueName: \"kubernetes.io/projected/473cc886-cb7a-49ef-adcf-4990dd0b604b-kube-api-access-4nssl\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.811444 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.811421 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/473cc886-cb7a-49ef-adcf-4990dd0b604b-proc\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " 
pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.811690 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.811470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/473cc886-cb7a-49ef-adcf-4990dd0b604b-lib-modules\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.811690 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.811491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/473cc886-cb7a-49ef-adcf-4990dd0b604b-podres\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.811690 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.811493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/473cc886-cb7a-49ef-adcf-4990dd0b604b-proc\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.811690 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.811544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/473cc886-cb7a-49ef-adcf-4990dd0b604b-sys\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.811690 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.811617 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/473cc886-cb7a-49ef-adcf-4990dd0b604b-sys\") pod 
\"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.811690 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.811630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/473cc886-cb7a-49ef-adcf-4990dd0b604b-lib-modules\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.811690 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.811660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/473cc886-cb7a-49ef-adcf-4990dd0b604b-podres\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.818645 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.818616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nssl\" (UniqueName: \"kubernetes.io/projected/473cc886-cb7a-49ef-adcf-4990dd0b604b-kube-api-access-4nssl\") pod \"perf-node-gather-daemonset-276px\" (UID: \"473cc886-cb7a-49ef-adcf-4990dd0b604b\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.848823 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.848803 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:46.973956 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.973920 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8bclg/perf-node-gather-daemonset-276px"] Apr 18 03:21:46.979918 ip-10-0-128-79 kubenswrapper[2577]: W0418 03:21:46.979888 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod473cc886_cb7a_49ef_adcf_4990dd0b604b.slice/crio-3a1d4916f9f3eaaf5e9b71a270704df89dc180e8f9b882166deede4dab08240c WatchSource:0}: Error finding container 3a1d4916f9f3eaaf5e9b71a270704df89dc180e8f9b882166deede4dab08240c: Status 404 returned error can't find the container with id 3a1d4916f9f3eaaf5e9b71a270704df89dc180e8f9b882166deede4dab08240c Apr 18 03:21:46.981596 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:46.981576 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 18 03:21:47.473750 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:47.473713 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/2.log" Apr 18 03:21:47.477709 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:47.477683 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sq72w_34bc19f4-315e-4021-baf3-35c6d4a2d5d8/console-operator/3.log" Apr 18 03:21:47.637662 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:47.637635 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" event={"ID":"473cc886-cb7a-49ef-adcf-4990dd0b604b","Type":"ContainerStarted","Data":"0abd3e53af6ccbb15f4973d77546998bb0f425615bf3e94c9ec858dcf9750cec"} Apr 18 03:21:47.637662 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:47.637666 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" event={"ID":"473cc886-cb7a-49ef-adcf-4990dd0b604b","Type":"ContainerStarted","Data":"3a1d4916f9f3eaaf5e9b71a270704df89dc180e8f9b882166deede4dab08240c"} Apr 18 03:21:47.638026 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:47.637695 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:47.652992 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:47.652950 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" podStartSLOduration=1.65293925 podStartE2EDuration="1.65293925s" podCreationTimestamp="2026-04-18 03:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 03:21:47.651047754 +0000 UTC m=+2150.790899089" watchObservedRunningTime="2026-04-18 03:21:47.65293925 +0000 UTC m=+2150.792790585" Apr 18 03:21:47.914564 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:47.914525 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-847dcd4d56-24mgv_7d4485fc-583f-4051-97ef-17721830f3e0/console/0.log" Apr 18 03:21:48.414706 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:48.414681 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-nprbc_581ba2b4-298f-4936-b05d-4ef4efbe33ad/volume-data-source-validator/0.log" Apr 18 03:21:49.117586 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:49.117537 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9l4pp_87aa1c86-8143-4fdc-b899-17184a387dcf/dns/0.log" Apr 18 03:21:49.138034 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:49.138014 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-9l4pp_87aa1c86-8143-4fdc-b899-17184a387dcf/kube-rbac-proxy/0.log" Apr 18 03:21:49.283606 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:49.283579 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qvkw2_525e6e89-8bf5-472a-bde7-bfb1254515af/dns-node-resolver/0.log" Apr 18 03:21:49.747835 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:49.747797 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cn84g_d48f7502-3de3-4ca9-92d5-5eaf5e999c97/node-ca/0.log" Apr 18 03:21:50.531194 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:50.531149 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fm44jh_1062f7de-5f7a-4797-a63a-e6799379b8fc/istio-proxy/0.log" Apr 18 03:21:50.625674 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:50.625642 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-mnlc5_2f2f0e0a-1564-4e11-9483-0870a4b1f8f2/discovery/0.log" Apr 18 03:21:50.646500 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:50.646460 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-79f769b49f-b22gm_9456592c-61cf-4a59-820e-7c061a014993/kube-auth-proxy/0.log" Apr 18 03:21:50.720245 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:50.720214 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-nchxm_e3ce470d-8790-41a4-9bbe-2a771cb5191c/istio-proxy/0.log" Apr 18 03:21:50.739333 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:50.739314 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-87df6b698-c7764_804c2955-592f-4663-b301-f7f6e6d14909/router/0.log" Apr 18 03:21:51.257206 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:51.257177 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ms6tq_17f79784-a585-4ce4-ae11-e420d136c2d0/serve-healthcheck-canary/0.log" Apr 18 03:21:51.686617 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:51.686524 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-25mgk_7920e7d2-3418-454b-9269-8f13a0c96d2d/insights-operator/0.log" Apr 18 03:21:51.687022 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:51.686961 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-25mgk_7920e7d2-3418-454b-9269-8f13a0c96d2d/insights-operator/1.log" Apr 18 03:21:51.704498 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:51.704474 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5hpwv_483d06b1-71a6-40dd-b533-c804d4e50b45/kube-rbac-proxy/0.log" Apr 18 03:21:51.722466 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:51.722443 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5hpwv_483d06b1-71a6-40dd-b533-c804d4e50b45/exporter/0.log" Apr 18 03:21:51.742726 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:51.742708 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5hpwv_483d06b1-71a6-40dd-b533-c804d4e50b45/extractor/0.log" Apr 18 03:21:53.651124 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:53.651096 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-276px" Apr 18 03:21:53.664138 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:53.664116 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-82x78_b1c32584-1033-4e0f-a731-86a24df69910/manager/0.log" Apr 18 03:21:53.693469 ip-10-0-128-79 kubenswrapper[2577]: I0418 
03:21:53.693444 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-f4b88597d-jw69s_d64cc1eb-aa61-49e6-9d51-529291d5c5e5/maas-api/0.log" Apr 18 03:21:53.739110 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:53.739082 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7dbc9c957b-9mzz9_cbf2ed7e-d4b9-4e1a-8f9a-ab65766dc2c8/manager/0.log" Apr 18 03:21:53.756788 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:53.756769 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-djdc9_0f7c1882-395f-4300-aeaa-4c83728c6e2e/manager/1.log" Apr 18 03:21:53.767547 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:53.767526 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-djdc9_0f7c1882-395f-4300-aeaa-4c83728c6e2e/manager/2.log" Apr 18 03:21:53.853000 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:53.852976 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-b6bf46549-x5vn4_f8222645-c589-4075-9ff6-ddaaaa73ed9f/manager/0.log" Apr 18 03:21:55.149882 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:21:55.149854 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-9mz7z_96520594-5eda-4ca9-a837-b81f770fafd1/openshift-lws-operator/0.log" Apr 18 03:22:00.618337 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:00.618310 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9ffdc_1f8ead79-25ed-4501-ab2c-99de1d600ce7/kube-multus/0.log" Apr 18 03:22:00.791477 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:00.791445 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dd75m_200381f5-de50-4d9c-ba7d-aac4abdd4c3d/kube-multus-additional-cni-plugins/0.log" Apr 18 03:22:00.809850 ip-10-0-128-79 
kubenswrapper[2577]: I0418 03:22:00.809826 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dd75m_200381f5-de50-4d9c-ba7d-aac4abdd4c3d/egress-router-binary-copy/0.log" Apr 18 03:22:00.827308 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:00.827286 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dd75m_200381f5-de50-4d9c-ba7d-aac4abdd4c3d/cni-plugins/0.log" Apr 18 03:22:00.849728 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:00.849706 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dd75m_200381f5-de50-4d9c-ba7d-aac4abdd4c3d/bond-cni-plugin/0.log" Apr 18 03:22:00.868106 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:00.868087 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dd75m_200381f5-de50-4d9c-ba7d-aac4abdd4c3d/routeoverride-cni/0.log" Apr 18 03:22:00.886160 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:00.886110 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dd75m_200381f5-de50-4d9c-ba7d-aac4abdd4c3d/whereabouts-cni-bincopy/0.log" Apr 18 03:22:00.905227 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:00.905207 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dd75m_200381f5-de50-4d9c-ba7d-aac4abdd4c3d/whereabouts-cni/0.log" Apr 18 03:22:01.113047 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:01.113019 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6xc88_64806518-b360-4104-92e5-8a3017ab382a/network-metrics-daemon/0.log" Apr 18 03:22:01.131517 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:01.131492 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-6xc88_64806518-b360-4104-92e5-8a3017ab382a/kube-rbac-proxy/0.log" Apr 18 03:22:01.999021 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:01.998993 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-controller/0.log" Apr 18 03:22:02.014620 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:02.014599 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/0.log" Apr 18 03:22:02.024386 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:02.024366 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovn-acl-logging/1.log" Apr 18 03:22:02.043173 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:02.043154 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/kube-rbac-proxy-node/0.log" Apr 18 03:22:02.067670 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:02.067650 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/kube-rbac-proxy-ovn-metrics/0.log" Apr 18 03:22:02.086260 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:02.086243 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/northd/0.log" Apr 18 03:22:02.106738 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:02.106715 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/nbdb/0.log" Apr 18 03:22:02.124690 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:02.124672 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/sbdb/0.log" Apr 18 03:22:02.238039 ip-10-0-128-79 kubenswrapper[2577]: I0418 03:22:02.238004 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cf2h2_20075b24-809d-40f9-8a39-d31291dbdc96/ovnkube-controller/0.log"