Apr 17 15:17:06.507594 ip-10-0-130-92 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 15:17:06.872573 ip-10-0-130-92 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 15:17:06.872573 ip-10-0-130-92 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 15:17:06.872573 ip-10-0-130-92 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 15:17:06.872573 ip-10-0-130-92 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 15:17:06.872573 ip-10-0-130-92 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 15:17:06.874040 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.873958 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 15:17:06.878106 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878080 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 15:17:06.878106 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878101 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 15:17:06.878106 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878106 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 15:17:06.878106 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878110 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 15:17:06.878106 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878114 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878120 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878124 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878127 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878131 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878135 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878139 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878143 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878147 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878150 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878154 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878158 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878161 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878165 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878168 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878172 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878176 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878179 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878183 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878187 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 15:17:06.878427 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878196 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878200 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878204 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878207 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878211 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878214 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878218 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878221 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878225 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878230 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878233 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878239 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878247 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878254 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878259 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878263 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878268 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878273 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878278 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 15:17:06.879223 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878282 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878287 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878291 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878296 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878300 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878305 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878326 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878331 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878338 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878343 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878348 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878352 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878356 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878360 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878364 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878368 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878372 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878377 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878381 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878386 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 15:17:06.879920 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878391 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878394 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878399 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878403 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878407 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878412 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878417 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878421 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878426 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878430 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878434 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878438 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878442 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878446 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878450 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878456 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878460 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878464 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878469 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878472 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 15:17:06.880528 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878477 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878481 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.878485 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879126 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879137 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879142 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879146 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879153 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879159 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879164 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879169 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879173 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879177 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879181 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879185 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879190 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879194 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879223 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879230 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879235 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 15:17:06.881302 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879240 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879244 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879248 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879253 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879258 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879262 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879267 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879273 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879278 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879284 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879288 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879293 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879298 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879302 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879324 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879332 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879336 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879340 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879345 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879349 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 15:17:06.882173 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879353 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879359 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879364 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879369 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879373 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879377 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879382 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879387 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879391 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879395 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879400 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879404 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879408 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879412 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879417 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879421 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879425 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879429 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879433 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 15:17:06.882690 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879438 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879443 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879447 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879451 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879455 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879461 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879466 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879470 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879475 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879479 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879483 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879487 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879491 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879495 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879498 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879502 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879506 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879510 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879515 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879521 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 15:17:06.883244 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879531 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879536 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879540 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879544 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879548 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879552 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879556 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879560 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879564 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.879569 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880718 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880733 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880745 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880752 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880760 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880765 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880772 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880779 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880784 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880790 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880796 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 15:17:06.883989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880801 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880806 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880812 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880817 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880822 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880827 2567 flags.go:64] FLAG: --cloud-config=""
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880831 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880836 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880842 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880847 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880852 2567 flags.go:64] FLAG: --config-dir=""
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880857 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880862 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880869 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880874 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880879 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880884 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880889 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880894 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880899 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880904 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880909 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880916 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880921 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880925 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 15:17:06.884545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880930 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880936 2567 flags.go:64] FLAG: --enable-server="true"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880940 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880948 2567 flags.go:64] FLAG: --event-burst="100"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880953 2567 flags.go:64] FLAG: --event-qps="50"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880960 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880965 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880970 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880977 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880982 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880987 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880992 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.880997 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881001 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881006 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881010 2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881015 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881019 2567 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881024 2567 flags.go:64] FLAG: --feature-gates=""
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881031 2567 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881035 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881041 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881046 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881051 2567 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881056 2567 flags.go:64] FLAG: --help="false"
Apr 17 15:17:06.885232 ip-10-0-130-92 kubenswrapper[2567]: I0417
15:17:06.881060 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-130-92.ec2.internal" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881066 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881071 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881075 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881081 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881087 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881092 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881096 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881101 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881106 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881111 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881117 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881123 2567 flags.go:64] FLAG: --kube-reserved="" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881128 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 
15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881132 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881137 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881141 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881146 2567 flags.go:64] FLAG: --lock-file="" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881151 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881156 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881161 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881169 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881174 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881178 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 15:17:06.885865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881183 2567 flags.go:64] FLAG: --logging-format="text" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881188 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881194 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881199 2567 flags.go:64] FLAG: --manifest-url="" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881203 2567 
flags.go:64] FLAG: --manifest-url-header="" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881210 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881215 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881222 2567 flags.go:64] FLAG: --max-pods="110" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881228 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881232 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881237 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881242 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881247 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881251 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881256 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881268 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881273 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881278 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881283 2567 flags.go:64] FLAG: --pod-cidr="" Apr 17 15:17:06.886490 
ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881289 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881297 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881302 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881323 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881328 2567 flags.go:64] FLAG: --port="10250" Apr 17 15:17:06.886490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881333 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881338 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a179d5d2ee086f57" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881343 2567 flags.go:64] FLAG: --qos-reserved="" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881347 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881352 2567 flags.go:64] FLAG: --register-node="true" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881357 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881362 2567 flags.go:64] FLAG: --register-with-taints="" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881368 2567 flags.go:64] FLAG: --registry-burst="10" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881372 2567 flags.go:64] FLAG: --registry-qps="5" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:17:06.881377 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881381 2567 flags.go:64] FLAG: --reserved-memory="" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881387 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881392 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881397 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881402 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881407 2567 flags.go:64] FLAG: --runonce="false" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881411 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881416 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881421 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881426 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881431 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881436 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881440 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881445 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 
15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881451 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881456 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 15:17:06.887080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881460 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881466 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881474 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881479 2567 flags.go:64] FLAG: --system-cgroups="" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881483 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881493 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881498 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881502 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881509 2567 flags.go:64] FLAG: --tls-min-version="" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881513 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881518 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881523 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881527 2567 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881534 2567 flags.go:64] FLAG: --v="2" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881542 2567 flags.go:64] FLAG: --version="false" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881549 2567 flags.go:64] FLAG: --vmodule="" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881559 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.881564 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881717 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881724 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881729 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881734 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881738 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 15:17:06.887744 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881743 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881747 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881752 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 
15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881756 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881763 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881768 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881773 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881778 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881784 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881790 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881795 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881801 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881806 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881811 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881816 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881821 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881826 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881830 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881835 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 15:17:06.888367 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881839 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881843 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881847 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 
15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881853 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881858 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881862 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881866 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881871 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881875 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881879 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881883 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881887 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881892 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881896 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881900 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881905 2567 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881909 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881913 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881918 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 15:17:06.888842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881922 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881926 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881930 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881934 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881937 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881946 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881952 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881956 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881960 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 15:17:06.889326 
ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881965 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881969 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881973 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881977 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881982 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881986 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881990 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.881996 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882001 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882005 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882009 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 15:17:06.889326 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882013 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882017 2567 feature_gate.go:328] 
unrecognized feature gate: ImageModeStatusReporting Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882021 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882025 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882030 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882034 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882038 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882043 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882047 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882051 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882055 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882059 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882062 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882067 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 15:17:06.889816 ip-10-0-130-92 
kubenswrapper[2567]: W0417 15:17:06.882070 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882075 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882079 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882085 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882089 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882094 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 15:17:06.889816 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882098 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882101 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.882105 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.882777 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.889125 
2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.889140 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889186 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889191 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889195 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889197 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889200 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889203 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889206 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889208 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889211 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889214 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 15:17:06.890296 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889217 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: 
W0417 15:17:06.889219 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889222 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889224 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889227 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889230 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889233 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889235 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889240 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889244 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889248 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889251 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889253 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889256 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889258 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889261 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889264 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889267 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889272 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 15:17:06.890729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889275 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889278 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889282 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889285 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889287 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889290 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889293 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889295 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889298 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889300 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889303 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889306 2567 
feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889321 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889324 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889327 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889329 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889332 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889335 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889338 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889341 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 15:17:06.891225 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889343 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889346 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889349 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889351 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 15:17:06.891733 ip-10-0-130-92 
kubenswrapper[2567]: W0417 15:17:06.889354 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889357 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889359 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889362 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889364 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889367 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889369 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889372 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889374 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889377 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889380 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889383 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889386 2567 feature_gate.go:328] unrecognized feature gate: 
Example2 Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889388 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889391 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889394 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 15:17:06.891733 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889396 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889399 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889401 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889404 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889407 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889410 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889412 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889415 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889417 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889420 
2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889423 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889425 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889428 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889430 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889433 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889435 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 15:17:06.892224 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889438 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.889443 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889564 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889569 
2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889572 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889575 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889578 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889581 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889584 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889587 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889590 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889593 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889596 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889599 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889602 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 15:17:06.892634 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889605 2567 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstall Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889607 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889610 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889612 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889615 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889617 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889620 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889622 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889625 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889627 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889630 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889632 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889635 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889637 2567 
feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889640 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889642 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889645 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889647 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889650 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889653 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 15:17:06.893004 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889656 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889659 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889661 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889664 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889666 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889669 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889671 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889674 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889676 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889679 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889682 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889685 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889687 2567 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889690 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889692 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889695 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889697 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889700 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889703 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889705 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 15:17:06.893510 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889708 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889710 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889712 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889715 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889718 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889720 2567 feature_gate.go:328] 
unrecognized feature gate: AzureWorkloadIdentity Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889723 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889725 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889727 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889730 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889732 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889735 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889737 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889739 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889742 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889744 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889747 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889749 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889751 2567 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889754 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 15:17:06.893995 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889756 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889759 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889761 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889764 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889767 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889769 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889772 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889774 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889776 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889779 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889781 2567 feature_gate.go:328] unrecognized feature gate: 
BootImageSkewEnforcement Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889784 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:06.889787 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.889792 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 15:17:06.894581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.889903 2567 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 15:17:06.894927 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.893704 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 15:17:06.894927 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.894519 2567 server.go:1019] "Starting client certificate rotation" Apr 17 15:17:06.894927 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.894613 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 15:17:06.894927 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.894652 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 15:17:06.916671 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.916653 2567 dynamic_cafile_content.go:123] "Loaded a 
new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 15:17:06.919244 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.919227 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 15:17:06.936624 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.936600 2567 log.go:25] "Validated CRI v1 runtime API" Apr 17 15:17:06.942051 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.942031 2567 log.go:25] "Validated CRI v1 image API" Apr 17 15:17:06.943675 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.943654 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 15:17:06.944364 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.944348 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 15:17:06.948066 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.948044 2567 fs.go:135] Filesystem UUIDs: map[305b1397-aeea-4597-aec8-b677629a185e:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 d86116b5-455a-41ac-9028-0c0e9caaee9b:/dev/nvme0n1p3] Apr 17 15:17:06.948131 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.948063 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 15:17:06.954148 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.954039 2567 manager.go:217] Machine: {Timestamp:2026-04-17 15:17:06.952156349 +0000 UTC m=+0.343698376 CPUVendorID:GenuineIntel 
NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3125482 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29d4639febddfb70b2b93c11bb5a08 SystemUUID:ec29d463-9feb-ddfb-70b2-b93c11bb5a08 BootID:003b75c9-ad09-43f4-ad93-e76b4f641ede Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:24:14:c9:26:2f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:24:14:c9:26:2f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:ac:a7:2d:e0:1e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 15:17:06.954148 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.954144 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 15:17:06.954254 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.954224 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 15:17:06.955379 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.955353 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 15:17:06.955525 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.955380 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-92.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 15:17:06.955572 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.955536 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 15:17:06.955572 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.955545 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 15:17:06.955572 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.955562 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 15:17:06.956244 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.956234 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 15:17:06.957789 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.957779 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 15:17:06.957898 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.957890 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 15:17:06.959799 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.959789 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 15:17:06.959830 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.959802 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 15:17:06.959830 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.959814 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 15:17:06.959830 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.959825 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 17 15:17:06.959938 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.959834 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 15:17:06.960928 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.960916 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 15:17:06.960974 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.960935 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 15:17:06.963519 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.963503 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 15:17:06.964863 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.964850 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 15:17:06.966866 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.966855 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 15:17:06.966907 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.966873 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 15:17:06.966907 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.966880 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 15:17:06.966907 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.966886 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 15:17:06.966907 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.966892 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 15:17:06.966907 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.966899 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 15:17:06.966907 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.966904 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 15:17:06.967074 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.966910 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 15:17:06.967074 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.966917 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 15:17:06.967074 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.966922 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 15:17:06.967074 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.966931 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 15:17:06.967074 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.966940 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 15:17:06.967628 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.967619 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 15:17:06.967628 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.967628 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 15:17:06.971279 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.971267 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 15:17:06.971333 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.971300 2567 server.go:1295] "Started kubelet"
Apr 17 15:17:06.971485 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.971426 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 15:17:06.971575 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.971505 2567 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 15:17:06.971669 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.971586 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 15:17:06.972113 ip-10-0-130-92 systemd[1]: Started Kubernetes Kubelet.
Apr 17 15:17:06.972520 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.972481 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-92.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 15:17:06.972595 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:06.972568 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 15:17:06.972644 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.972602 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 15:17:06.972828 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:06.972674 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-92.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 15:17:06.975661 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.975641 2567 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 15:17:06.978578 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:06.975891 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-92.ec2.internal.18a72de20fc37e1b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-92.ec2.internal,UID:ip-10-0-130-92.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-92.ec2.internal,},FirstTimestamp:2026-04-17 15:17:06.971278875 +0000 UTC m=+0.362820901,LastTimestamp:2026-04-17 15:17:06.971278875 +0000 UTC m=+0.362820901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-92.ec2.internal,}"
Apr 17 15:17:06.979250 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.979230 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 15:17:06.979620 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.979601 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 15:17:06.980247 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.980230 2567 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 15:17:06.980398 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.980379 2567 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 15:17:06.980398 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.980327 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 15:17:06.980555 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.980491 2567 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 15:17:06.980555 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.980499 2567 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 15:17:06.980694 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:06.980675 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:06.980813 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.980793 2567 factory.go:55] Registering systemd factory
Apr 17 15:17:06.980813 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.980814 2567 factory.go:223] Registration of the systemd container factory successfully
Apr 17 15:17:06.981100 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.981082 2567 factory.go:153] Registering CRI-O factory
Apr 17 15:17:06.981162 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.981104 2567 factory.go:223] Registration of the crio container factory successfully
Apr 17 15:17:06.981162 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.981160 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 15:17:06.981242 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.981188 2567 factory.go:103] Registering Raw factory
Apr 17 15:17:06.981242 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.981202 2567 manager.go:1196] Started watching for new ooms in manager
Apr 17 15:17:06.981805 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:06.981760 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 15:17:06.981896 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.981821 2567 manager.go:319] Starting recovery of all containers
Apr 17 15:17:06.981978 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:06.981957 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-92.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 17 15:17:06.982031 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:06.981981 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 17 15:17:06.983821 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.983801 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-njcw8"
Apr 17 15:17:06.991518 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.991398 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-njcw8"
Apr 17 15:17:06.993895 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.993878 2567 manager.go:324] Recovery completed
Apr 17 15:17:06.998688 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:06.998676 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 15:17:07.003384 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.003369 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 15:17:07.003452 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.003397 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 15:17:07.003452 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.003417 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasSufficientPID"
Apr 17 15:17:07.003936 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.003918 2567 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 15:17:07.003936 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.003933 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 15:17:07.004055 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.003951 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 15:17:07.007342 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.007330 2567 policy_none.go:49] "None policy: Start"
Apr 17 15:17:07.007381 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.007351 2567 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 15:17:07.007381 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.007362 2567 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 15:17:07.060727 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.045876 2567 manager.go:341] "Starting Device Plugin manager"
Apr 17 15:17:07.060727 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.045906 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 15:17:07.060727 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.045917 2567 server.go:85] "Starting device plugin registration server"
Apr 17 15:17:07.060727 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.046138 2567 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 15:17:07.060727 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.046149 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 15:17:07.060727 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.046241 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 15:17:07.060727 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.046335 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 15:17:07.060727 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.046344 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 15:17:07.060727 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.046923 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 15:17:07.060727 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.046966 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:07.106940 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.106905 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 15:17:07.108230 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.108210 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 15:17:07.108355 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.108234 2567 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 15:17:07.108355 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.108256 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 15:17:07.108355 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.108263 2567 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 15:17:07.108355 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.108295 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 15:17:07.111773 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.111750 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 15:17:07.146631 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.146565 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 15:17:07.147546 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.147530 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 15:17:07.147623 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.147560 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 15:17:07.147623 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.147571 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasSufficientPID"
Apr 17 15:17:07.147623 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.147597 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.153654 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.153640 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.153700 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.153662 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-92.ec2.internal\": node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:07.165519 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.165499 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:07.208370 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.208347 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-92.ec2.internal"]
Apr 17 15:17:07.208422 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.208410 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 15:17:07.209860 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.209843 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 15:17:07.209936 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.209873 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 15:17:07.209936 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.209884 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasSufficientPID"
Apr 17 15:17:07.212292 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.212280 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 15:17:07.212508 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.212495 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.212544 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.212522 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 15:17:07.213053 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.213036 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 15:17:07.213115 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.213049 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 15:17:07.213115 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.213069 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 15:17:07.213115 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.213080 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasSufficientPID"
Apr 17 15:17:07.213115 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.213091 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 15:17:07.213115 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.213101 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasSufficientPID"
Apr 17 15:17:07.215226 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.215211 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.215278 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.215239 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 15:17:07.215957 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.215941 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 15:17:07.216020 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.215973 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 15:17:07.216020 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.215986 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeHasSufficientPID"
Apr 17 15:17:07.243088 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.243063 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-92.ec2.internal\" not found" node="ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.246436 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.246418 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-92.ec2.internal\" not found" node="ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.266451 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.266436 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:07.281137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.281118 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71e821ae6b8df1a5a1fd11ea19a5e77f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal\" (UID: \"71e821ae6b8df1a5a1fd11ea19a5e77f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.281198 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.281142 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8c8041c1c70e4c5187623822d69c931b-config\") pod \"kube-apiserver-proxy-ip-10-0-130-92.ec2.internal\" (UID: \"8c8041c1c70e4c5187623822d69c931b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.281198 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.281161 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/71e821ae6b8df1a5a1fd11ea19a5e77f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal\" (UID: \"71e821ae6b8df1a5a1fd11ea19a5e77f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.366745 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.366714 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:07.382000 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.381978 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71e821ae6b8df1a5a1fd11ea19a5e77f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal\" (UID: \"71e821ae6b8df1a5a1fd11ea19a5e77f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.382105 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.382006 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8c8041c1c70e4c5187623822d69c931b-config\") pod \"kube-apiserver-proxy-ip-10-0-130-92.ec2.internal\" (UID: \"8c8041c1c70e4c5187623822d69c931b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.382105 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.382025 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/71e821ae6b8df1a5a1fd11ea19a5e77f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal\" (UID: \"71e821ae6b8df1a5a1fd11ea19a5e77f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.382105 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.382065 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/71e821ae6b8df1a5a1fd11ea19a5e77f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal\" (UID: \"71e821ae6b8df1a5a1fd11ea19a5e77f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.382105 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.382053 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71e821ae6b8df1a5a1fd11ea19a5e77f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal\" (UID: \"71e821ae6b8df1a5a1fd11ea19a5e77f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.382246 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.382077 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8c8041c1c70e4c5187623822d69c931b-config\") pod \"kube-apiserver-proxy-ip-10-0-130-92.ec2.internal\" (UID: \"8c8041c1c70e4c5187623822d69c931b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.467467 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.467389 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:07.544890 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.544855 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.549418 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.549399 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:07.568075 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.568051 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:07.668615 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.668583 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:07.769128 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.769057 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:07.869679 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.869651 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:07.894116 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.894091 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 15:17:07.894634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.894254 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 15:17:07.970698 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:07.970668 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:07.979670 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.979648 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 15:17:07.992248 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.992226 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 15:17:07.994114 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.994093 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 15:12:06 +0000 UTC" deadline="2027-11-12 15:36:49.786794843 +0000 UTC"
Apr 17 15:17:07.994176 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:07.994115 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13776h19m41.792683092s"
Apr 17 15:17:08.014269 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.014244 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-knhdh"
Apr 17 15:17:08.023278 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.023233 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-knhdh"
Apr 17 15:17:08.071719 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:08.071697 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:08.099715 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:08.099688 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71e821ae6b8df1a5a1fd11ea19a5e77f.slice/crio-f314e4fd348e0b0d65bb6e13ee94ac62ee814dc255782995f1a3bf92b9bf59ce WatchSource:0}: Error finding container f314e4fd348e0b0d65bb6e13ee94ac62ee814dc255782995f1a3bf92b9bf59ce: Status 404 returned error can't find the container with id f314e4fd348e0b0d65bb6e13ee94ac62ee814dc255782995f1a3bf92b9bf59ce
Apr 17 15:17:08.100072 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:08.100048 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c8041c1c70e4c5187623822d69c931b.slice/crio-804aff81787f0c7dcc0bf357f4b980848fafde3505b2b817176f8220edff9862 WatchSource:0}: Error finding container 804aff81787f0c7dcc0bf357f4b980848fafde3505b2b817176f8220edff9862: Status 404 returned error can't find the container with id 804aff81787f0c7dcc0bf357f4b980848fafde3505b2b817176f8220edff9862
Apr 17 15:17:08.103589 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.103575 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 15:17:08.111024 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.110986 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal" event={"ID":"71e821ae6b8df1a5a1fd11ea19a5e77f","Type":"ContainerStarted","Data":"f314e4fd348e0b0d65bb6e13ee94ac62ee814dc255782995f1a3bf92b9bf59ce"}
Apr 17 15:17:08.111982 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.111961 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-92.ec2.internal" event={"ID":"8c8041c1c70e4c5187623822d69c931b","Type":"ContainerStarted","Data":"804aff81787f0c7dcc0bf357f4b980848fafde3505b2b817176f8220edff9862"}
Apr 17 15:17:08.172400 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:08.172376 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-92.ec2.internal\" not found"
Apr 17 15:17:08.204841 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.204821 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 15:17:08.224212 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.224190 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 15:17:08.281082 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.281013 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:08.297396 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.297365 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 15:17:08.298858 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.298424 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-92.ec2.internal"
Apr 17 15:17:08.305524 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.305503 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 15:17:08.306023 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.306010 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 15:17:08.961083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.961046 2567 
apiserver.go:52] "Watching apiserver" Apr 17 15:17:08.969531 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.969509 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 15:17:08.969907 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.969884 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-92.ec2.internal","openshift-cluster-node-tuning-operator/tuned-dxkdc","openshift-image-registry/node-ca-4n2f7","openshift-multus/multus-additional-cni-plugins-fhh97","openshift-multus/multus-wn95c","openshift-network-operator/iptables-alerter-27fdl","openshift-ovn-kubernetes/ovnkube-node-rphqm","kube-system/konnectivity-agent-mxqss","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x","openshift-dns/node-resolver-r2ffn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal","openshift-multus/network-metrics-daemon-n8fjz","openshift-network-diagnostics/network-check-target-6wr6j"] Apr 17 15:17:08.972947 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.972927 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.975966 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.975945 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.976975 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.976937 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 15:17:08.977083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.976976 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 15:17:08.977083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.976990 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 15:17:08.977083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.977003 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pr6fw\"" Apr 17 15:17:08.977083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.977020 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 15:17:08.977332 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.977129 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 15:17:08.977332 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.977146 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 15:17:08.978016 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.977997 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 15:17:08.978410 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.978387 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mwlrr\"" Apr 17 15:17:08.978491 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.978463 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 15:17:08.978593 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.978575 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4n2f7" Apr 17 15:17:08.980550 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.980530 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 15:17:08.980672 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.980547 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 15:17:08.981188 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.981002 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-f72dc\"" Apr 17 15:17:08.981188 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.981070 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 15:17:08.981188 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.981141 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:08.983421 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.983404 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wn95c" Apr 17 15:17:08.984651 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.984061 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 15:17:08.984651 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.984142 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 15:17:08.984651 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.984331 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kpt76\"" Apr 17 15:17:08.984651 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.984567 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 15:17:08.984919 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.984817 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 15:17:08.984919 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.984875 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 15:17:08.987055 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.985957 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q5mqj\"" Apr 17 15:17:08.988060 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.988007 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 15:17:08.988285 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.988263 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-27fdl" Apr 17 15:17:08.989806 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.989789 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-sysconfig\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.989899 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.989821 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-lib-modules\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.989899 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.989851 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-systemd-units\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.989991 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.989902 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-slash\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.989991 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.989940 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-run-openvswitch\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.989991 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.989968 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-cni-netd\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.990119 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.989994 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c60e7d84-f238-44b7-91dd-6bebb34d4158-ovnkube-config\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.990119 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990024 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c60e7d84-f238-44b7-91dd-6bebb34d4158-env-overrides\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.990119 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990065 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-sysctl-conf\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.990119 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990088 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f42d472-7136-4d33-b081-4e8ae758480e-host\") pod \"node-ca-4n2f7\" (UID: \"1f42d472-7136-4d33-b081-4e8ae758480e\") " pod="openshift-image-registry/node-ca-4n2f7" Apr 17 15:17:08.990119 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990113 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-system-cni-dir\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:08.990336 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990138 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-var-lib-openvswitch\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.990336 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990160 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-log-socket\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.990336 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990180 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-kubernetes\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.990336 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990198 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-run\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.990336 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990239 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-sys\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.990336 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990276 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-cni-binary-copy\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:08.990336 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990323 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-cni-bin\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.990634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990349 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/c60e7d84-f238-44b7-91dd-6bebb34d4158-ovn-node-metrics-cert\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.990634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990373 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1f42d472-7136-4d33-b081-4e8ae758480e-serviceca\") pod \"node-ca-4n2f7\" (UID: \"1f42d472-7136-4d33-b081-4e8ae758480e\") " pod="openshift-image-registry/node-ca-4n2f7" Apr 17 15:17:08.990634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990397 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6spw6\" (UniqueName: \"kubernetes.io/projected/1f42d472-7136-4d33-b081-4e8ae758480e-kube-api-access-6spw6\") pod \"node-ca-4n2f7\" (UID: \"1f42d472-7136-4d33-b081-4e8ae758480e\") " pod="openshift-image-registry/node-ca-4n2f7" Apr 17 15:17:08.990634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990438 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-os-release\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:08.990634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990466 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:08.990634 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:17:08.990493 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvt8t\" (UniqueName: \"kubernetes.io/projected/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-kube-api-access-kvt8t\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:08.990634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990537 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-run-ovn\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.990634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990561 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-node-log\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.990634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990585 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:08.990634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990612 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-run-netns\") pod 
\"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.990634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990632 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mxqss" Apr 17 15:17:08.991122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990741 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 15:17:08.991122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990634 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-sysctl-d\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.991122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990852 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 15:17:08.991122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990879 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-systemd\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.991122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990904 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-tuned\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.991122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990927 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzkwt\" (UniqueName: \"kubernetes.io/projected/14d75cd8-6c31-4216-a34f-742e9cc2a898-kube-api-access-bzkwt\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.991122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.990951 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-kubelet\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.991122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991016 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-run-ovn-kubernetes\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.991122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991056 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 15:17:08.991122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991056 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c60e7d84-f238-44b7-91dd-6bebb34d4158-ovnkube-script-lib\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 
15:17:08.991122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991107 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-modprobe-d\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.991581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991129 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lstsb\"" Apr 17 15:17:08.991581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991140 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-var-lib-kubelet\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.991581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991171 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-host\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.991581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991195 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:08.991581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991221 
2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-etc-openvswitch\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.991581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991245 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14d75cd8-6c31-4216-a34f-742e9cc2a898-tmp\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:08.991581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991282 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-cnibin\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:08.991581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991328 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-run-systemd\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.991581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991357 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rphqm\" (UID: 
\"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.991581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.991380 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdl6w\" (UniqueName: \"kubernetes.io/projected/c60e7d84-f238-44b7-91dd-6bebb34d4158-kube-api-access-xdl6w\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:08.992897 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.992861 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" Apr 17 15:17:08.993469 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.993433 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 15:17:08.993549 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.993493 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7xl9g\"" Apr 17 15:17:08.994239 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.994221 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 15:17:08.995062 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.995042 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 15:17:08.995211 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.995193 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xm52c\"" Apr 17 15:17:08.995290 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.995245 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 15:17:08.995559 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.995542 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 15:17:08.997490 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.997458 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r2ffn" Apr 17 15:17:08.997609 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.997586 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:08.997726 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:08.997691 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c" Apr 17 15:17:08.999785 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.999762 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 15:17:08.999912 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.999792 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 15:17:08.999912 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:08.999814 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:08.999912 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:08.999866 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08" Apr 17 15:17:09.000124 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.000105 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-znk6p\"" Apr 17 15:17:09.023984 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.023942 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 15:12:08 +0000 UTC" deadline="2027-10-12 00:57:10.415599188 +0000 UTC" Apr 17 15:17:09.024096 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.024077 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13017h40m1.391527305s" Apr 17 15:17:09.081876 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.081847 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 15:17:09.091835 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.091804 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-node-log\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.091979 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.091846 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-sysctl-d\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.091979 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.091874 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-systemd\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.091979 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.091901 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-tuned\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.091979 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.091931 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-os-release\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.091979 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.091941 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-node-log\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.091979 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.091956 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-var-lib-kubelet\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.091979 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.091950 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-systemd\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.092280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.091983 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eff92ac3-7f84-4934-b020-eda543896879-agent-certs\") pod \"konnectivity-agent-mxqss\" (UID: \"eff92ac3-7f84-4934-b020-eda543896879\") " pod="kube-system/konnectivity-agent-mxqss" Apr 17 15:17:09.092280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092015 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-kubelet\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.092280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092015 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-sysctl-d\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.092280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092050 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-kubelet\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.092280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092078 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c60e7d84-f238-44b7-91dd-6bebb34d4158-ovnkube-script-lib\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.092280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092116 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-modprobe-d\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.092280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092154 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-multus-socket-dir-parent\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.092280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092193 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" Apr 17 
15:17:09.092280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092251 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.092280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092269 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-etc-openvswitch\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092288 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eff92ac3-7f84-4934-b020-eda543896879-konnectivity-ca\") pod \"konnectivity-agent-mxqss\" (UID: \"eff92ac3-7f84-4934-b020-eda543896879\") " pod="kube-system/konnectivity-agent-mxqss" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092306 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-registration-dir\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092344 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-sysconfig\") pod \"tuned-dxkdc\" (UID: 
\"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092360 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-systemd-units\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092364 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092384 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-run-openvswitch\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092410 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-cni-netd\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092411 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " 
pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092434 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-sysctl-conf\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092440 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-sysconfig\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092452 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-run\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092467 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f42d472-7136-4d33-b081-4e8ae758480e-host\") pod \"node-ca-4n2f7\" (UID: \"1f42d472-7136-4d33-b081-4e8ae758480e\") " pod="openshift-image-registry/node-ca-4n2f7" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092486 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-run-openvswitch\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092503 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-var-lib-openvswitch\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092508 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-cni-netd\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092380 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-etc-openvswitch\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092522 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-systemd-units\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.092803 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092520 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-log-socket\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092547 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-log-socket\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092562 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-system-cni-dir\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092569 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-run\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092588 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f42d472-7136-4d33-b081-4e8ae758480e-host\") pod \"node-ca-4n2f7\" (UID: \"1f42d472-7136-4d33-b081-4e8ae758480e\") " pod="openshift-image-registry/node-ca-4n2f7" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092588 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-var-lib-cni-bin\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 
15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092622 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-run-multus-certs\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092629 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-var-lib-openvswitch\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-cni-binary-copy\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092684 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-sysctl-conf\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092683 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-cni-bin\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092744 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c60e7d84-f238-44b7-91dd-6bebb34d4158-ovn-node-metrics-cert\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092781 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c60e7d84-f238-44b7-91dd-6bebb34d4158-ovnkube-script-lib\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092789 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-hostroot\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092811 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-cni-bin\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092834 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-multus-conf-dir\") pod \"multus-wn95c\" (UID: 
\"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092866 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9fb501e8-358b-4ede-bb90-e53237beeef0-multus-daemon-config\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.093642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092892 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-etc-kubernetes\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092918 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w5kg\" (UniqueName: \"kubernetes.io/projected/30857904-b776-4486-98cc-f89642587b8a-kube-api-access-2w5kg\") pod \"iptables-alerter-27fdl\" (UID: \"30857904-b776-4486-98cc-f89642587b8a\") " pod="openshift-network-operator/iptables-alerter-27fdl" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092953 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6spw6\" (UniqueName: \"kubernetes.io/projected/1f42d472-7136-4d33-b081-4e8ae758480e-kube-api-access-6spw6\") pod \"node-ca-4n2f7\" (UID: \"1f42d472-7136-4d33-b081-4e8ae758480e\") " pod="openshift-image-registry/node-ca-4n2f7" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092979 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-os-release\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.092985 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-modprobe-d\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093007 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093068 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-os-release\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093116 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvt8t\" (UniqueName: \"kubernetes.io/projected/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-kube-api-access-kvt8t\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: 
I0417 15:17:09.093145 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-run-ovn\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093169 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzkwt\" (UniqueName: \"kubernetes.io/projected/14d75cd8-6c31-4216-a34f-742e9cc2a898-kube-api-access-bzkwt\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093194 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-run-netns\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093202 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-cni-binary-copy\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093223 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " 
pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093225 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-run-ovn\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093250 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-run-netns\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093276 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57czn\" (UniqueName: \"kubernetes.io/projected/9fb501e8-358b-4ede-bb90-e53237beeef0-kube-api-access-57czn\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.094400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093303 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w2nl\" (UniqueName: \"kubernetes.io/projected/7ef14720-2641-45c2-84ff-e658786a8152-kube-api-access-8w2nl\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093349 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wz2n\" (UniqueName: 
\"kubernetes.io/projected/a2a09a76-6aeb-4520-9efb-287cddc7f75b-kube-api-access-6wz2n\") pod \"node-resolver-r2ffn\" (UID: \"a2a09a76-6aeb-4520-9efb-287cddc7f75b\") " pod="openshift-dns/node-resolver-r2ffn" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093377 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-run-ovn-kubernetes\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093404 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-var-lib-kubelet\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093428 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-host\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093456 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-socket-dir\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093482 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-device-dir\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093509 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-sys-fs\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093531 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-run-netns\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093541 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a2a09a76-6aeb-4520-9efb-287cddc7f75b-hosts-file\") pod \"node-resolver-r2ffn\" (UID: \"a2a09a76-6aeb-4520-9efb-287cddc7f75b\") " pod="openshift-dns/node-resolver-r2ffn" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093570 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14d75cd8-6c31-4216-a34f-742e9cc2a898-tmp\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: 
I0417 15:17:09.093604 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-cnibin\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093632 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-var-lib-cni-multus\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093659 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-cnibin\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093673 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-var-lib-kubelet\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093685 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-run-systemd\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.095137 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:17:09.093712 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.095137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093720 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093733 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-run-ovn-kubernetes\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093739 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdl6w\" (UniqueName: \"kubernetes.io/projected/c60e7d84-f238-44b7-91dd-6bebb34d4158-kube-api-access-xdl6w\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093481 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093765 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-lib-modules\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093807 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-cnibin\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093824 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-host\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093880 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2a09a76-6aeb-4520-9efb-287cddc7f75b-tmp-dir\") pod \"node-resolver-r2ffn\" (UID: \"a2a09a76-6aeb-4520-9efb-287cddc7f75b\") " pod="openshift-dns/node-resolver-r2ffn" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093888 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-run-systemd\") pod 
\"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093913 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-lib-modules\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093920 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-slash\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093939 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.093983 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c60e7d84-f238-44b7-91dd-6bebb34d4158-ovnkube-config\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094011 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/c60e7d84-f238-44b7-91dd-6bebb34d4158-env-overrides\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094051 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c60e7d84-f238-44b7-91dd-6bebb34d4158-host-slash\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094094 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-sys\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094135 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-sys\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.095751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094239 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/30857904-b776-4486-98cc-f89642587b8a-host-slash\") pod \"iptables-alerter-27fdl\" (UID: \"30857904-b776-4486-98cc-f89642587b8a\") " pod="openshift-network-operator/iptables-alerter-27fdl" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094289 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-etc-selinux\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094350 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094390 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-system-cni-dir\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094419 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-kubernetes\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094447 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9fb501e8-358b-4ede-bb90-e53237beeef0-cni-binary-copy\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094478 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-269f8\" (UniqueName: \"kubernetes.io/projected/48990b07-a036-41ef-a6cd-89d7520c417c-kube-api-access-269f8\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094490 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c60e7d84-f238-44b7-91dd-6bebb34d4158-env-overrides\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094499 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-system-cni-dir\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094504 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-multus-cni-dir\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094537 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-run-k8s-cni-cncf-io\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 
17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094564 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-kubernetes\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094602 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/30857904-b776-4486-98cc-f89642587b8a-iptables-alerter-script\") pod \"iptables-alerter-27fdl\" (UID: \"30857904-b776-4486-98cc-f89642587b8a\") " pod="openshift-network-operator/iptables-alerter-27fdl" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094630 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26mfb\" (UniqueName: \"kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb\") pod \"network-check-target-6wr6j\" (UID: \"d9daeb55-7347-4d29-a0ea-04ac78140a08\") " pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.094680 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1f42d472-7136-4d33-b081-4e8ae758480e-serviceca\") pod \"node-ca-4n2f7\" (UID: \"1f42d472-7136-4d33-b081-4e8ae758480e\") " pod="openshift-image-registry/node-ca-4n2f7" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.095106 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c60e7d84-f238-44b7-91dd-6bebb34d4158-ovnkube-config\") pod \"ovnkube-node-rphqm\" (UID: 
\"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.095706 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1f42d472-7136-4d33-b081-4e8ae758480e-serviceca\") pod \"node-ca-4n2f7\" (UID: \"1f42d472-7136-4d33-b081-4e8ae758480e\") " pod="openshift-image-registry/node-ca-4n2f7" Apr 17 15:17:09.096515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.096153 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c60e7d84-f238-44b7-91dd-6bebb34d4158-ovn-node-metrics-cert\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.097333 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.096592 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/14d75cd8-6c31-4216-a34f-742e9cc2a898-etc-tuned\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.097333 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.096692 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14d75cd8-6c31-4216-a34f-742e9cc2a898-tmp\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.103739 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.103685 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzkwt\" (UniqueName: \"kubernetes.io/projected/14d75cd8-6c31-4216-a34f-742e9cc2a898-kube-api-access-bzkwt\") pod \"tuned-dxkdc\" (UID: \"14d75cd8-6c31-4216-a34f-742e9cc2a898\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" Apr 17 15:17:09.103840 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.103778 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6spw6\" (UniqueName: \"kubernetes.io/projected/1f42d472-7136-4d33-b081-4e8ae758480e-kube-api-access-6spw6\") pod \"node-ca-4n2f7\" (UID: \"1f42d472-7136-4d33-b081-4e8ae758480e\") " pod="openshift-image-registry/node-ca-4n2f7" Apr 17 15:17:09.104266 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.104245 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvt8t\" (UniqueName: \"kubernetes.io/projected/ffaaed56-110b-4fd5-9fbe-e8e71f6de33d-kube-api-access-kvt8t\") pod \"multus-additional-cni-plugins-fhh97\" (UID: \"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d\") " pod="openshift-multus/multus-additional-cni-plugins-fhh97" Apr 17 15:17:09.104440 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.104419 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdl6w\" (UniqueName: \"kubernetes.io/projected/c60e7d84-f238-44b7-91dd-6bebb34d4158-kube-api-access-xdl6w\") pod \"ovnkube-node-rphqm\" (UID: \"c60e7d84-f238-44b7-91dd-6bebb34d4158\") " pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:09.113923 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.113905 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 15:17:09.195908 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.195877 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-run-k8s-cni-cncf-io\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.196066 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.195919 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/30857904-b776-4486-98cc-f89642587b8a-iptables-alerter-script\") pod \"iptables-alerter-27fdl\" (UID: \"30857904-b776-4486-98cc-f89642587b8a\") " pod="openshift-network-operator/iptables-alerter-27fdl" Apr 17 15:17:09.196066 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196002 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-run-k8s-cni-cncf-io\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.196066 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196006 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26mfb\" (UniqueName: \"kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb\") pod \"network-check-target-6wr6j\" (UID: \"d9daeb55-7347-4d29-a0ea-04ac78140a08\") " pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:09.196066 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196054 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-os-release\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.196274 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196085 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-var-lib-kubelet\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.196274 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:17:09.196110 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eff92ac3-7f84-4934-b020-eda543896879-agent-certs\") pod \"konnectivity-agent-mxqss\" (UID: \"eff92ac3-7f84-4934-b020-eda543896879\") " pod="kube-system/konnectivity-agent-mxqss" Apr 17 15:17:09.196274 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196132 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-multus-socket-dir-parent\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.196274 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196153 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" Apr 17 15:17:09.196274 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196170 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eff92ac3-7f84-4934-b020-eda543896879-konnectivity-ca\") pod \"konnectivity-agent-mxqss\" (UID: \"eff92ac3-7f84-4934-b020-eda543896879\") " pod="kube-system/konnectivity-agent-mxqss" Apr 17 15:17:09.196274 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196185 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-registration-dir\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" Apr 17 15:17:09.196274 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196195 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-var-lib-kubelet\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.196274 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196207 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-os-release\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.196274 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196211 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-system-cni-dir\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.196274 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196257 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-var-lib-cni-bin\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.196274 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196269 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-multus-socket-dir-parent\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 
15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196285 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-run-multus-certs\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196297 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196330 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-registration-dir\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196331 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-hostroot\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196367 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-var-lib-cni-bin\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c" Apr 17 
15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196374 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-run-multus-certs\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196382 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-multus-conf-dir\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196371 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-system-cni-dir\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196410 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9fb501e8-358b-4ede-bb90-e53237beeef0-multus-daemon-config\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196414 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-hostroot\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196437 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-multus-conf-dir\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196441 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-etc-kubernetes\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196473 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-etc-kubernetes\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196473 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w5kg\" (UniqueName: \"kubernetes.io/projected/30857904-b776-4486-98cc-f89642587b8a-kube-api-access-2w5kg\") pod \"iptables-alerter-27fdl\" (UID: \"30857904-b776-4486-98cc-f89642587b8a\") " pod="openshift-network-operator/iptables-alerter-27fdl"
Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196516 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-run-netns\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57czn\" (UniqueName: \"kubernetes.io/projected/9fb501e8-358b-4ede-bb90-e53237beeef0-kube-api-access-57czn\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196574 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8w2nl\" (UniqueName: \"kubernetes.io/projected/7ef14720-2641-45c2-84ff-e658786a8152-kube-api-access-8w2nl\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x"
Apr 17 15:17:09.196817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196592 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/30857904-b776-4486-98cc-f89642587b8a-iptables-alerter-script\") pod \"iptables-alerter-27fdl\" (UID: \"30857904-b776-4486-98cc-f89642587b8a\") " pod="openshift-network-operator/iptables-alerter-27fdl"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196601 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wz2n\" (UniqueName: \"kubernetes.io/projected/a2a09a76-6aeb-4520-9efb-287cddc7f75b-kube-api-access-6wz2n\") pod \"node-resolver-r2ffn\" (UID: \"a2a09a76-6aeb-4520-9efb-287cddc7f75b\") " pod="openshift-dns/node-resolver-r2ffn"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196648 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-socket-dir\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196669 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-run-netns\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196689 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-device-dir\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196760 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-device-dir\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196786 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-socket-dir\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196821 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-sys-fs\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196848 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a2a09a76-6aeb-4520-9efb-287cddc7f75b-hosts-file\") pod \"node-resolver-r2ffn\" (UID: \"a2a09a76-6aeb-4520-9efb-287cddc7f75b\") " pod="openshift-dns/node-resolver-r2ffn"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196873 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-cnibin\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196899 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-var-lib-cni-multus\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196930 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2a09a76-6aeb-4520-9efb-287cddc7f75b-tmp-dir\") pod \"node-resolver-r2ffn\" (UID: \"a2a09a76-6aeb-4520-9efb-287cddc7f75b\") " pod="openshift-dns/node-resolver-r2ffn"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196931 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-sys-fs\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196952 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eff92ac3-7f84-4934-b020-eda543896879-konnectivity-ca\") pod \"konnectivity-agent-mxqss\" (UID: \"eff92ac3-7f84-4934-b020-eda543896879\") " pod="kube-system/konnectivity-agent-mxqss"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196962 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/30857904-b776-4486-98cc-f89642587b8a-host-slash\") pod \"iptables-alerter-27fdl\" (UID: \"30857904-b776-4486-98cc-f89642587b8a\") " pod="openshift-network-operator/iptables-alerter-27fdl"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196991 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-cnibin\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.196996 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-etc-selinux\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.197001 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9fb501e8-358b-4ede-bb90-e53237beeef0-multus-daemon-config\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.197558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.197017 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-host-var-lib-cni-multus\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.198137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.197021 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:09.198137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.197080 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/30857904-b776-4486-98cc-f89642587b8a-host-slash\") pod \"iptables-alerter-27fdl\" (UID: \"30857904-b776-4486-98cc-f89642587b8a\") " pod="openshift-network-operator/iptables-alerter-27fdl"
Apr 17 15:17:09.198137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.197085 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a2a09a76-6aeb-4520-9efb-287cddc7f75b-hosts-file\") pod \"node-resolver-r2ffn\" (UID: \"a2a09a76-6aeb-4520-9efb-287cddc7f75b\") " pod="openshift-dns/node-resolver-r2ffn"
Apr 17 15:17:09.198137 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:09.197120 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:09.198137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.197125 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7ef14720-2641-45c2-84ff-e658786a8152-etc-selinux\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x"
Apr 17 15:17:09.198137 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:09.197180 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs podName:48990b07-a036-41ef-a6cd-89d7520c417c nodeName:}" failed. No retries permitted until 2026-04-17 15:17:09.697163277 +0000 UTC m=+3.088705310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs") pod "network-metrics-daemon-n8fjz" (UID: "48990b07-a036-41ef-a6cd-89d7520c417c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:09.198137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.197194 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9fb501e8-358b-4ede-bb90-e53237beeef0-cni-binary-copy\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.198137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.197221 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-269f8\" (UniqueName: \"kubernetes.io/projected/48990b07-a036-41ef-a6cd-89d7520c417c-kube-api-access-269f8\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:09.198137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.197245 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-multus-cni-dir\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.198137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.197247 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2a09a76-6aeb-4520-9efb-287cddc7f75b-tmp-dir\") pod \"node-resolver-r2ffn\" (UID: \"a2a09a76-6aeb-4520-9efb-287cddc7f75b\") " pod="openshift-dns/node-resolver-r2ffn"
Apr 17 15:17:09.198137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.197360 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9fb501e8-358b-4ede-bb90-e53237beeef0-multus-cni-dir\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.198137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.197651 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9fb501e8-358b-4ede-bb90-e53237beeef0-cni-binary-copy\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.199393 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.199375 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eff92ac3-7f84-4934-b020-eda543896879-agent-certs\") pod \"konnectivity-agent-mxqss\" (UID: \"eff92ac3-7f84-4934-b020-eda543896879\") " pod="kube-system/konnectivity-agent-mxqss"
Apr 17 15:17:09.202466 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:09.202445 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 15:17:09.202466 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:09.202466 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 15:17:09.202626 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:09.202478 2567 projected.go:194] Error preparing data for projected volume kube-api-access-26mfb for pod openshift-network-diagnostics/network-check-target-6wr6j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:09.202626 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:09.202568 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb podName:d9daeb55-7347-4d29-a0ea-04ac78140a08 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:09.70255145 +0000 UTC m=+3.094093489 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-26mfb" (UniqueName: "kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb") pod "network-check-target-6wr6j" (UID: "d9daeb55-7347-4d29-a0ea-04ac78140a08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:09.205403 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.205384 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wz2n\" (UniqueName: \"kubernetes.io/projected/a2a09a76-6aeb-4520-9efb-287cddc7f75b-kube-api-access-6wz2n\") pod \"node-resolver-r2ffn\" (UID: \"a2a09a76-6aeb-4520-9efb-287cddc7f75b\") " pod="openshift-dns/node-resolver-r2ffn"
Apr 17 15:17:09.205483 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.205441 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w2nl\" (UniqueName: \"kubernetes.io/projected/7ef14720-2641-45c2-84ff-e658786a8152-kube-api-access-8w2nl\") pod \"aws-ebs-csi-driver-node-lvc6x\" (UID: \"7ef14720-2641-45c2-84ff-e658786a8152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x"
Apr 17 15:17:09.205535 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.205473 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w5kg\" (UniqueName: \"kubernetes.io/projected/30857904-b776-4486-98cc-f89642587b8a-kube-api-access-2w5kg\") pod \"iptables-alerter-27fdl\" (UID: \"30857904-b776-4486-98cc-f89642587b8a\") " pod="openshift-network-operator/iptables-alerter-27fdl"
Apr 17 15:17:09.205839 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.205813 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57czn\" (UniqueName: \"kubernetes.io/projected/9fb501e8-358b-4ede-bb90-e53237beeef0-kube-api-access-57czn\") pod \"multus-wn95c\" (UID: \"9fb501e8-358b-4ede-bb90-e53237beeef0\") " pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.206902 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.206884 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-269f8\" (UniqueName: \"kubernetes.io/projected/48990b07-a036-41ef-a6cd-89d7520c417c-kube-api-access-269f8\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:09.286956 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.286856 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm"
Apr 17 15:17:09.292386 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.292365 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dxkdc"
Apr 17 15:17:09.301056 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.301029 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4n2f7"
Apr 17 15:17:09.306596 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.306576 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fhh97"
Apr 17 15:17:09.313154 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.313136 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wn95c"
Apr 17 15:17:09.321676 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.321658 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-27fdl"
Apr 17 15:17:09.327203 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.327185 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mxqss"
Apr 17 15:17:09.332707 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.332685 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x"
Apr 17 15:17:09.339213 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.339196 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r2ffn"
Apr 17 15:17:09.700735 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.700665 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:09.700864 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:09.700778 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:09.700864 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:09.700834 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs podName:48990b07-a036-41ef-a6cd-89d7520c417c nodeName:}" failed. No retries permitted until 2026-04-17 15:17:10.700819523 +0000 UTC m=+4.092361537 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs") pod "network-metrics-daemon-n8fjz" (UID: "48990b07-a036-41ef-a6cd-89d7520c417c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:09.726320 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:09.726282 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ef14720_2641_45c2_84ff_e658786a8152.slice/crio-c17056d390b92e00cd68c90b936e8d07cd44d870d89a9282826356d27f93f6c4 WatchSource:0}: Error finding container c17056d390b92e00cd68c90b936e8d07cd44d870d89a9282826356d27f93f6c4: Status 404 returned error can't find the container with id c17056d390b92e00cd68c90b936e8d07cd44d870d89a9282826356d27f93f6c4
Apr 17 15:17:09.727238 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:09.727214 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffaaed56_110b_4fd5_9fbe_e8e71f6de33d.slice/crio-0fe13b34791fca1d497102f14f1d2bf39f5b2a71cd03d263fb6c2445228543e4 WatchSource:0}: Error finding container 0fe13b34791fca1d497102f14f1d2bf39f5b2a71cd03d263fb6c2445228543e4: Status 404 returned error can't find the container with id 0fe13b34791fca1d497102f14f1d2bf39f5b2a71cd03d263fb6c2445228543e4
Apr 17 15:17:09.728394 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:09.728371 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30857904_b776_4486_98cc_f89642587b8a.slice/crio-e936c8e3df58810cf30a5432132fe9ae991f2d7bd8cafffc0f9ef7a819ba70be WatchSource:0}: Error finding container e936c8e3df58810cf30a5432132fe9ae991f2d7bd8cafffc0f9ef7a819ba70be: Status 404 returned error can't find the container with id e936c8e3df58810cf30a5432132fe9ae991f2d7bd8cafffc0f9ef7a819ba70be
Apr 17 15:17:09.729532 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:09.729512 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2a09a76_6aeb_4520_9efb_287cddc7f75b.slice/crio-9ac76521f6cc5dcc9d9b9a7f6123f6acfa23c15306794f63d675ce4c07e5f91f WatchSource:0}: Error finding container 9ac76521f6cc5dcc9d9b9a7f6123f6acfa23c15306794f63d675ce4c07e5f91f: Status 404 returned error can't find the container with id 9ac76521f6cc5dcc9d9b9a7f6123f6acfa23c15306794f63d675ce4c07e5f91f
Apr 17 15:17:09.734099 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:09.734076 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fb501e8_358b_4ede_bb90_e53237beeef0.slice/crio-c77d2841e1d99f058a881ee57f7356120e4582b6cd2677d8262e312064795622 WatchSource:0}: Error finding container c77d2841e1d99f058a881ee57f7356120e4582b6cd2677d8262e312064795622: Status 404 returned error can't find the container with id c77d2841e1d99f058a881ee57f7356120e4582b6cd2677d8262e312064795622
Apr 17 15:17:09.735175 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:09.735147 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f42d472_7136_4d33_b081_4e8ae758480e.slice/crio-a9251013b286e8e46a1dcc5fcb1cdd67f5899e58a53c306b1d56cdb7f08abc2b WatchSource:0}: Error finding container a9251013b286e8e46a1dcc5fcb1cdd67f5899e58a53c306b1d56cdb7f08abc2b: Status 404 returned error can't find the container with id a9251013b286e8e46a1dcc5fcb1cdd67f5899e58a53c306b1d56cdb7f08abc2b
Apr 17 15:17:09.735861 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:09.735683 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc60e7d84_f238_44b7_91dd_6bebb34d4158.slice/crio-9251719e24d34a66d7fba2eb419544aebeb7e188be8677f992c1321965410676 WatchSource:0}: Error finding container 9251719e24d34a66d7fba2eb419544aebeb7e188be8677f992c1321965410676: Status 404 returned error can't find the container with id 9251719e24d34a66d7fba2eb419544aebeb7e188be8677f992c1321965410676
Apr 17 15:17:09.737070 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:09.737016 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeff92ac3_7f84_4934_b020_eda543896879.slice/crio-6d7dac1e78366704570522f1987b0e8c1ddf585651de669eeaecf91f74526689 WatchSource:0}: Error finding container 6d7dac1e78366704570522f1987b0e8c1ddf585651de669eeaecf91f74526689: Status 404 returned error can't find the container with id 6d7dac1e78366704570522f1987b0e8c1ddf585651de669eeaecf91f74526689
Apr 17 15:17:09.738416 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:09.738138 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14d75cd8_6c31_4216_a34f_742e9cc2a898.slice/crio-40e8f4a4bab779ef58538cacd40628199dcb5d7c56f314e61cc6dd908fd574f6 WatchSource:0}: Error finding container 40e8f4a4bab779ef58538cacd40628199dcb5d7c56f314e61cc6dd908fd574f6: Status 404 returned error can't find the container with id 40e8f4a4bab779ef58538cacd40628199dcb5d7c56f314e61cc6dd908fd574f6
Apr 17 15:17:09.801853 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:09.801708 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26mfb\" (UniqueName: \"kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb\") pod \"network-check-target-6wr6j\" (UID: \"d9daeb55-7347-4d29-a0ea-04ac78140a08\") " pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:17:09.801943 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:09.801865 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 15:17:09.801943 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:09.801881 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 15:17:09.801943 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:09.801891 2567 projected.go:194] Error preparing data for projected volume kube-api-access-26mfb for pod openshift-network-diagnostics/network-check-target-6wr6j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:09.801943 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:09.801937 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb podName:d9daeb55-7347-4d29-a0ea-04ac78140a08 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:10.801923552 +0000 UTC m=+4.193465569 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-26mfb" (UniqueName: "kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb") pod "network-check-target-6wr6j" (UID: "d9daeb55-7347-4d29-a0ea-04ac78140a08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:10.025340 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.025191 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 15:12:08 +0000 UTC" deadline="2027-12-09 21:10:27.120502815 +0000 UTC"
Apr 17 15:17:10.025340 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.025231 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14429h53m17.095276287s"
Apr 17 15:17:10.108616 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.108591 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:10.108767 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:10.108692 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c"
Apr 17 15:17:10.116998 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.116955 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-92.ec2.internal" event={"ID":"8c8041c1c70e4c5187623822d69c931b","Type":"ContainerStarted","Data":"89b80d84ed1f3c5761c5a7e45d43acf6babfbf1d6d98197f4ed39630172dfd87"}
Apr 17 15:17:10.118030 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.118008 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" event={"ID":"c60e7d84-f238-44b7-91dd-6bebb34d4158","Type":"ContainerStarted","Data":"9251719e24d34a66d7fba2eb419544aebeb7e188be8677f992c1321965410676"}
Apr 17 15:17:10.118992 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.118973 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wn95c" event={"ID":"9fb501e8-358b-4ede-bb90-e53237beeef0","Type":"ContainerStarted","Data":"c77d2841e1d99f058a881ee57f7356120e4582b6cd2677d8262e312064795622"}
Apr 17 15:17:10.119830 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.119803 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r2ffn" event={"ID":"a2a09a76-6aeb-4520-9efb-287cddc7f75b","Type":"ContainerStarted","Data":"9ac76521f6cc5dcc9d9b9a7f6123f6acfa23c15306794f63d675ce4c07e5f91f"}
Apr 17 15:17:10.120897 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.120878 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhh97" event={"ID":"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d","Type":"ContainerStarted","Data":"0fe13b34791fca1d497102f14f1d2bf39f5b2a71cd03d263fb6c2445228543e4"}
Apr 17 15:17:10.121838 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.121819 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" event={"ID":"7ef14720-2641-45c2-84ff-e658786a8152","Type":"ContainerStarted","Data":"c17056d390b92e00cd68c90b936e8d07cd44d870d89a9282826356d27f93f6c4"}
Apr 17 15:17:10.122788 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.122759 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" event={"ID":"14d75cd8-6c31-4216-a34f-742e9cc2a898","Type":"ContainerStarted","Data":"40e8f4a4bab779ef58538cacd40628199dcb5d7c56f314e61cc6dd908fd574f6"}
Apr 17 15:17:10.123663 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.123645 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mxqss" event={"ID":"eff92ac3-7f84-4934-b020-eda543896879","Type":"ContainerStarted","Data":"6d7dac1e78366704570522f1987b0e8c1ddf585651de669eeaecf91f74526689"}
Apr 17 15:17:10.124507 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.124481 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4n2f7" event={"ID":"1f42d472-7136-4d33-b081-4e8ae758480e","Type":"ContainerStarted","Data":"a9251013b286e8e46a1dcc5fcb1cdd67f5899e58a53c306b1d56cdb7f08abc2b"}
Apr 17 15:17:10.125721 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.125699 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-27fdl" event={"ID":"30857904-b776-4486-98cc-f89642587b8a","Type":"ContainerStarted","Data":"e936c8e3df58810cf30a5432132fe9ae991f2d7bd8cafffc0f9ef7a819ba70be"}
Apr 17 15:17:10.129122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.129083 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-92.ec2.internal" podStartSLOduration=2.129073024 podStartE2EDuration="2.129073024s" podCreationTimestamp="2026-04-17 15:17:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:17:10.128997888 +0000 UTC m=+3.520539924" watchObservedRunningTime="2026-04-17 15:17:10.129073024 +0000 UTC m=+3.520615059"
Apr 17 15:17:10.709763 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.709144 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:10.709763 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:10.709322 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:10.709763 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:10.709385 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs podName:48990b07-a036-41ef-a6cd-89d7520c417c nodeName:}" failed. No retries permitted until 2026-04-17 15:17:12.709367323 +0000 UTC m=+6.100909349 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs") pod "network-metrics-daemon-n8fjz" (UID: "48990b07-a036-41ef-a6cd-89d7520c417c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:10.810791 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:10.810158 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26mfb\" (UniqueName: \"kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb\") pod \"network-check-target-6wr6j\" (UID: \"d9daeb55-7347-4d29-a0ea-04ac78140a08\") " pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:17:10.810791 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:10.810332 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 15:17:10.810791 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:10.810351 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 15:17:10.810791 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:10.810365 2567 projected.go:194] Error preparing data for projected volume kube-api-access-26mfb for pod openshift-network-diagnostics/network-check-target-6wr6j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:10.810791 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:10.810424 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb podName:d9daeb55-7347-4d29-a0ea-04ac78140a08 nodeName:}" failed.
No retries permitted until 2026-04-17 15:17:12.810406477 +0000 UTC m=+6.201948505 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-26mfb" (UniqueName: "kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb") pod "network-check-target-6wr6j" (UID: "d9daeb55-7347-4d29-a0ea-04ac78140a08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:11.108640 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:11.108558 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:11.109061 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:11.108690 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08" Apr 17 15:17:11.150108 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:11.148972 2567 generic.go:358] "Generic (PLEG): container finished" podID="71e821ae6b8df1a5a1fd11ea19a5e77f" containerID="ed0697aaf95b7abf872dc36966c7d526e80c6ea176a220809092d90edee6fcc7" exitCode=0 Apr 17 15:17:11.150108 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:11.149875 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal" event={"ID":"71e821ae6b8df1a5a1fd11ea19a5e77f","Type":"ContainerDied","Data":"ed0697aaf95b7abf872dc36966c7d526e80c6ea176a220809092d90edee6fcc7"} Apr 17 15:17:12.109329 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:12.109266 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:12.109802 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:12.109424 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c" Apr 17 15:17:12.169753 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:12.169717 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal" event={"ID":"71e821ae6b8df1a5a1fd11ea19a5e77f","Type":"ContainerStarted","Data":"4f8662b666a289db59721f2db72f8ea6a92ec7c53a300dd0b8f7f946c4218e55"} Apr 17 15:17:12.182505 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:12.182453 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-92.ec2.internal" podStartSLOduration=4.182436641 podStartE2EDuration="4.182436641s" podCreationTimestamp="2026-04-17 15:17:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:17:12.182092141 +0000 UTC m=+5.573634178" watchObservedRunningTime="2026-04-17 15:17:12.182436641 +0000 UTC m=+5.573978678" Apr 17 15:17:12.727945 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:12.727394 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:12.727945 
ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:12.727547 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:12.727945 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:12.727609 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs podName:48990b07-a036-41ef-a6cd-89d7520c417c nodeName:}" failed. No retries permitted until 2026-04-17 15:17:16.727591036 +0000 UTC m=+10.119133063 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs") pod "network-metrics-daemon-n8fjz" (UID: "48990b07-a036-41ef-a6cd-89d7520c417c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:12.829386 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:12.828754 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26mfb\" (UniqueName: \"kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb\") pod \"network-check-target-6wr6j\" (UID: \"d9daeb55-7347-4d29-a0ea-04ac78140a08\") " pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:12.829386 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:12.828918 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 15:17:12.829386 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:12.828937 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 15:17:12.829386 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:12.828949 2567 projected.go:194] Error preparing data for projected volume 
kube-api-access-26mfb for pod openshift-network-diagnostics/network-check-target-6wr6j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:12.829386 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:12.829005 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb podName:d9daeb55-7347-4d29-a0ea-04ac78140a08 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:16.828987754 +0000 UTC m=+10.220529773 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-26mfb" (UniqueName: "kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb") pod "network-check-target-6wr6j" (UID: "d9daeb55-7347-4d29-a0ea-04ac78140a08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:13.109408 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:13.108855 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:13.109408 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:13.108979 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08" Apr 17 15:17:14.109237 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:14.109193 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:14.109444 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:14.109350 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c" Apr 17 15:17:15.109496 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:15.109057 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:15.109496 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:15.109182 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08" Apr 17 15:17:16.109173 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.109139 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:16.109361 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:16.109295 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c" Apr 17 15:17:16.214987 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.214081 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-z8tnm"] Apr 17 15:17:16.217278 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.217246 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:16.217414 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:16.217354 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850" Apr 17 15:17:16.256004 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.255800 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ea6be3e1-87f0-4c68-b704-4a21dbd76850-dbus\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:16.256004 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.255874 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ea6be3e1-87f0-4c68-b704-4a21dbd76850-kubelet-config\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:16.256004 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.255950 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:16.356385 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.356347 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:16.356551 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.356408 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ea6be3e1-87f0-4c68-b704-4a21dbd76850-dbus\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:16.356551 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.356464 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ea6be3e1-87f0-4c68-b704-4a21dbd76850-kubelet-config\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:16.356551 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:16.356530 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 15:17:16.356686 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.356597 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ea6be3e1-87f0-4c68-b704-4a21dbd76850-kubelet-config\") pod 
\"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:16.357056 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:16.356796 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret podName:ea6be3e1-87f0-4c68-b704-4a21dbd76850 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:16.856586 +0000 UTC m=+10.248128032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret") pod "global-pull-secret-syncer-z8tnm" (UID: "ea6be3e1-87f0-4c68-b704-4a21dbd76850") : object "kube-system"/"original-pull-secret" not registered Apr 17 15:17:16.365247 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.365063 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ea6be3e1-87f0-4c68-b704-4a21dbd76850-dbus\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:16.760285 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.760198 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:16.760459 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:16.760383 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:16.760459 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:16.760453 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs podName:48990b07-a036-41ef-a6cd-89d7520c417c nodeName:}" failed. No retries permitted until 2026-04-17 15:17:24.760434894 +0000 UTC m=+18.151976912 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs") pod "network-metrics-daemon-n8fjz" (UID: "48990b07-a036-41ef-a6cd-89d7520c417c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:16.861127 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.861081 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26mfb\" (UniqueName: \"kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb\") pod \"network-check-target-6wr6j\" (UID: \"d9daeb55-7347-4d29-a0ea-04ac78140a08\") " pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:16.861296 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:16.861138 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:16.861383 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:16.861323 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 15:17:16.861441 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:16.861386 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret podName:ea6be3e1-87f0-4c68-b704-4a21dbd76850 nodeName:}" failed. 
No retries permitted until 2026-04-17 15:17:17.861368707 +0000 UTC m=+11.252910737 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret") pod "global-pull-secret-syncer-z8tnm" (UID: "ea6be3e1-87f0-4c68-b704-4a21dbd76850") : object "kube-system"/"original-pull-secret" not registered Apr 17 15:17:16.861760 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:16.861633 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 15:17:16.861760 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:16.861661 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 15:17:16.861760 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:16.861675 2567 projected.go:194] Error preparing data for projected volume kube-api-access-26mfb for pod openshift-network-diagnostics/network-check-target-6wr6j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:16.861760 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:16.861728 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb podName:d9daeb55-7347-4d29-a0ea-04ac78140a08 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:24.861711883 +0000 UTC m=+18.253253902 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-26mfb" (UniqueName: "kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb") pod "network-check-target-6wr6j" (UID: "d9daeb55-7347-4d29-a0ea-04ac78140a08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:17.109915 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:17.109786 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:17.109915 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:17.109902 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08" Apr 17 15:17:17.867855 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:17.867818 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:17.868299 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:17.867942 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 15:17:17.868299 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:17.867992 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret podName:ea6be3e1-87f0-4c68-b704-4a21dbd76850 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:19.867979315 +0000 UTC m=+13.259521331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret") pod "global-pull-secret-syncer-z8tnm" (UID: "ea6be3e1-87f0-4c68-b704-4a21dbd76850") : object "kube-system"/"original-pull-secret" not registered Apr 17 15:17:18.109355 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:18.109279 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:18.109535 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:18.109429 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c" Apr 17 15:17:18.109804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:18.109774 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:18.109919 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:18.109890 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850" Apr 17 15:17:19.109325 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.109118 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:19.109705 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:19.109434 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08" Apr 17 15:17:19.182609 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.182577 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r2ffn" event={"ID":"a2a09a76-6aeb-4520-9efb-287cddc7f75b","Type":"ContainerStarted","Data":"d4b8b34ce2d3e7153fc16dbb8c5691251f0ad077e63fcc8dacc2310d5c1cc984"} Apr 17 15:17:19.184154 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.184123 2567 generic.go:358] "Generic (PLEG): container finished" podID="ffaaed56-110b-4fd5-9fbe-e8e71f6de33d" containerID="e72d6833754d59c12922877a948474ab481cf5562a79f47de9cb3d951f887e52" exitCode=0 Apr 17 15:17:19.184281 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.184198 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhh97" event={"ID":"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d","Type":"ContainerDied","Data":"e72d6833754d59c12922877a948474ab481cf5562a79f47de9cb3d951f887e52"} Apr 17 15:17:19.185857 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.185833 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" event={"ID":"7ef14720-2641-45c2-84ff-e658786a8152","Type":"ContainerStarted","Data":"77bd1c8c5b4f22586cba2506fb831132f3185a8c9761a5016dc2de97cbe304f6"} Apr 17 15:17:19.187805 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.187725 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" event={"ID":"14d75cd8-6c31-4216-a34f-742e9cc2a898","Type":"ContainerStarted","Data":"543b0d28d4ccd0b30c632cfc232699600512212eedac3149622337d9b009e555"} Apr 17 15:17:19.190140 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.190116 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mxqss" 
event={"ID":"eff92ac3-7f84-4934-b020-eda543896879","Type":"ContainerStarted","Data":"5bf54cb63dcb56d463383c7fb403bc2fab5314d110476d112125556fad09b499"}
Apr 17 15:17:19.191602 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.191579 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4n2f7" event={"ID":"1f42d472-7136-4d33-b081-4e8ae758480e","Type":"ContainerStarted","Data":"8fc8275f065ca347b638899f9e0cf3e3e0c7af756a6b18e7db1558be43c49207"}
Apr 17 15:17:19.195522 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.195478 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-r2ffn" podStartSLOduration=3.378523062 podStartE2EDuration="12.195462972s" podCreationTimestamp="2026-04-17 15:17:07 +0000 UTC" firstStartedPulling="2026-04-17 15:17:09.733148107 +0000 UTC m=+3.124690121" lastFinishedPulling="2026-04-17 15:17:18.550088019 +0000 UTC m=+11.941630031" observedRunningTime="2026-04-17 15:17:19.194336442 +0000 UTC m=+12.585878475" watchObservedRunningTime="2026-04-17 15:17:19.195462972 +0000 UTC m=+12.587005008"
Apr 17 15:17:19.207955 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.207909 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dxkdc" podStartSLOduration=3.244598984 podStartE2EDuration="12.207894683s" podCreationTimestamp="2026-04-17 15:17:07 +0000 UTC" firstStartedPulling="2026-04-17 15:17:09.739793034 +0000 UTC m=+3.131335049" lastFinishedPulling="2026-04-17 15:17:18.703088716 +0000 UTC m=+12.094630748" observedRunningTime="2026-04-17 15:17:19.207411534 +0000 UTC m=+12.598953570" watchObservedRunningTime="2026-04-17 15:17:19.207894683 +0000 UTC m=+12.599436720"
Apr 17 15:17:19.242341 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.242283 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4n2f7" podStartSLOduration=3.451178632 podStartE2EDuration="12.242268565s" podCreationTimestamp="2026-04-17 15:17:07 +0000 UTC" firstStartedPulling="2026-04-17 15:17:09.736804663 +0000 UTC m=+3.128346694" lastFinishedPulling="2026-04-17 15:17:18.527894611 +0000 UTC m=+11.919436627" observedRunningTime="2026-04-17 15:17:19.218345819 +0000 UTC m=+12.609887852" watchObservedRunningTime="2026-04-17 15:17:19.242268565 +0000 UTC m=+12.633810600"
Apr 17 15:17:19.256483 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.256451 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mxqss" podStartSLOduration=3.450145477 podStartE2EDuration="12.256439813s" podCreationTimestamp="2026-04-17 15:17:07 +0000 UTC" firstStartedPulling="2026-04-17 15:17:09.739200261 +0000 UTC m=+3.130742275" lastFinishedPulling="2026-04-17 15:17:18.545494591 +0000 UTC m=+11.937036611" observedRunningTime="2026-04-17 15:17:19.256294958 +0000 UTC m=+12.647836993" watchObservedRunningTime="2026-04-17 15:17:19.256439813 +0000 UTC m=+12.647981843"
Apr 17 15:17:19.625543 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.625516 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mxqss"
Apr 17 15:17:19.626127 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.626109 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mxqss"
Apr 17 15:17:19.885607 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:19.885519 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm"
Apr 17 15:17:19.885755 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:19.885705 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:19.885817 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:19.885773 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret podName:ea6be3e1-87f0-4c68-b704-4a21dbd76850 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:23.885755082 +0000 UTC m=+17.277297102 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret") pod "global-pull-secret-syncer-z8tnm" (UID: "ea6be3e1-87f0-4c68-b704-4a21dbd76850") : object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:20.108926 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:20.108892 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:20.109059 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:20.109035 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c"
Apr 17 15:17:20.109122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:20.109097 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm"
Apr 17 15:17:20.109219 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:20.109197 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850"
Apr 17 15:17:21.109287 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:21.109243 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:17:21.109859 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:21.109390 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08"
Apr 17 15:17:21.196335 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:21.196282 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-27fdl" event={"ID":"30857904-b776-4486-98cc-f89642587b8a","Type":"ContainerStarted","Data":"aa4ce60fa43c1bb36bea459ccaa4441c19d08661041ddc21f8426f0ded08cef7"}
Apr 17 15:17:21.196335 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:21.196330 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 15:17:21.208692 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:21.208643 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-27fdl" podStartSLOduration=5.394884166 podStartE2EDuration="14.208625821s" podCreationTimestamp="2026-04-17 15:17:07 +0000 UTC" firstStartedPulling="2026-04-17 15:17:09.732173949 +0000 UTC m=+3.123715968" lastFinishedPulling="2026-04-17 15:17:18.545915609 +0000 UTC m=+11.937457623" observedRunningTime="2026-04-17 15:17:21.208141492 +0000 UTC m=+14.599683528" watchObservedRunningTime="2026-04-17 15:17:21.208625821 +0000 UTC m=+14.600168049"
Apr 17 15:17:22.109149 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:22.109117 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm"
Apr 17 15:17:22.109354 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:22.109117 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:22.109354 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:22.109246 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850"
Apr 17 15:17:22.109354 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:22.109327 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c"
Apr 17 15:17:22.821253 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:22.821222 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mxqss"
Apr 17 15:17:22.821444 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:22.821345 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 15:17:22.822135 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:22.822109 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mxqss"
Apr 17 15:17:23.109507 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:23.109436 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:17:23.110080 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:23.109538 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08"
Apr 17 15:17:23.918629 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:23.918583 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm"
Apr 17 15:17:23.918831 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:23.918747 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:23.918831 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:23.918822 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret podName:ea6be3e1-87f0-4c68-b704-4a21dbd76850 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:31.918804932 +0000 UTC m=+25.310346949 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret") pod "global-pull-secret-syncer-z8tnm" (UID: "ea6be3e1-87f0-4c68-b704-4a21dbd76850") : object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:24.108589 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:24.108555 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm"
Apr 17 15:17:24.108758 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:24.108555 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:24.108758 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:24.108690 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850"
Apr 17 15:17:24.108758 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:24.108741 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c"
Apr 17 15:17:24.825433 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:24.825395 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:24.825772 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:24.825544 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:24.825772 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:24.825614 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs podName:48990b07-a036-41ef-a6cd-89d7520c417c nodeName:}" failed. No retries permitted until 2026-04-17 15:17:40.825595382 +0000 UTC m=+34.217137406 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs") pod "network-metrics-daemon-n8fjz" (UID: "48990b07-a036-41ef-a6cd-89d7520c417c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:24.926576 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:24.926544 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26mfb\" (UniqueName: \"kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb\") pod \"network-check-target-6wr6j\" (UID: \"d9daeb55-7347-4d29-a0ea-04ac78140a08\") " pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:17:24.926717 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:24.926703 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 15:17:24.926758 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:24.926722 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 15:17:24.926758 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:24.926732 2567 projected.go:194] Error preparing data for projected volume kube-api-access-26mfb for pod openshift-network-diagnostics/network-check-target-6wr6j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:24.926813 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:24.926783 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb podName:d9daeb55-7347-4d29-a0ea-04ac78140a08 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:40.926766541 +0000 UTC m=+34.318308571 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-26mfb" (UniqueName: "kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb") pod "network-check-target-6wr6j" (UID: "d9daeb55-7347-4d29-a0ea-04ac78140a08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:25.108825 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:25.108747 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:17:25.108982 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:25.108858 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08"
Apr 17 15:17:26.108801 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:26.108771 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm"
Apr 17 15:17:26.109202 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:26.108786 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:26.109202 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:26.108871 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850"
Apr 17 15:17:26.109202 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:26.108946 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c"
Apr 17 15:17:27.109778 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:27.109607 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:17:27.110336 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:27.109860 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08"
Apr 17 15:17:28.108594 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:28.108563 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:28.108594 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:28.108585 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm"
Apr 17 15:17:28.108836 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:28.108696 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c"
Apr 17 15:17:28.108836 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:28.108815 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850"
Apr 17 15:17:29.109074 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:29.109040 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:17:29.109567 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:29.109159 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08"
Apr 17 15:17:30.108744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:30.108663 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm"
Apr 17 15:17:30.108947 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:30.108825 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850"
Apr 17 15:17:30.108947 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:30.108849 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:30.109060 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:30.108970 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c"
Apr 17 15:17:30.410813 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:30.410698 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 15:17:31.059521 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.059361 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T15:17:30.410736814Z","UUID":"a0cccd3b-90cb-4198-bc1c-48df63962e97","Handler":null,"Name":"","Endpoint":""}
Apr 17 15:17:31.062575 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.062553 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 15:17:31.062575 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.062580 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 15:17:31.108922 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.108896 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:17:31.109047 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:31.109014 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08"
Apr 17 15:17:31.216620 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.216579 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" event={"ID":"c60e7d84-f238-44b7-91dd-6bebb34d4158","Type":"ContainerStarted","Data":"af2176364288b6e3c509034819d798eebba5adb73edcb674a70f27e9641aaa77"}
Apr 17 15:17:31.216812 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.216630 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" event={"ID":"c60e7d84-f238-44b7-91dd-6bebb34d4158","Type":"ContainerStarted","Data":"1b6ea3fbf27f468b2a32cde1c37542b8cafd9f955af763f877aca9ad307849c9"}
Apr 17 15:17:31.216812 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.216649 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" event={"ID":"c60e7d84-f238-44b7-91dd-6bebb34d4158","Type":"ContainerStarted","Data":"bafce53a9daadcdd40c41fff34cc96405d882d5cb40df6770303c0cc4c84b2fe"}
Apr 17 15:17:31.216812 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.216669 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" event={"ID":"c60e7d84-f238-44b7-91dd-6bebb34d4158","Type":"ContainerStarted","Data":"166291fccaf026efb94583f41ba0df55221f5ad1aa46ae59cfb8cc6b773afb77"}
Apr 17 15:17:31.216812 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.216701 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" event={"ID":"c60e7d84-f238-44b7-91dd-6bebb34d4158","Type":"ContainerStarted","Data":"f35c637b3b25695b1822d1a7b23fccd655dd40dd20b881f109388167fd9e3292"}
Apr 17 15:17:31.216812 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.216719 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" event={"ID":"c60e7d84-f238-44b7-91dd-6bebb34d4158","Type":"ContainerStarted","Data":"d7fb00f6243ef7b108487c2aa81c6e2326b37bda1f71229ab9634241e7f2d090"}
Apr 17 15:17:31.218157 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.218131 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wn95c" event={"ID":"9fb501e8-358b-4ede-bb90-e53237beeef0","Type":"ContainerStarted","Data":"22e3d78fcddfc35fd91b34b8b5b1359e8956f306d7f80df4aec45de8555e64fe"}
Apr 17 15:17:31.219915 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.219889 2567 generic.go:358] "Generic (PLEG): container finished" podID="ffaaed56-110b-4fd5-9fbe-e8e71f6de33d" containerID="ccb1613985dcfcf113e3bbf486d452f4759c36d60ae1f272570f488905248a5c" exitCode=0
Apr 17 15:17:31.220028 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.219962 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhh97" event={"ID":"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d","Type":"ContainerDied","Data":"ccb1613985dcfcf113e3bbf486d452f4759c36d60ae1f272570f488905248a5c"}
Apr 17 15:17:31.221450 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.221393 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" event={"ID":"7ef14720-2641-45c2-84ff-e658786a8152","Type":"ContainerStarted","Data":"3915419c0391f12090a6955b3cb0a562dd68c2d1eab51ac62e90438385313c6b"}
Apr 17 15:17:31.247564 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.246520 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wn95c" podStartSLOduration=3.679037697 podStartE2EDuration="24.246499013s" podCreationTimestamp="2026-04-17 15:17:07 +0000 UTC" firstStartedPulling="2026-04-17 15:17:09.735980549 +0000 UTC m=+3.127522577" lastFinishedPulling="2026-04-17 15:17:30.303441867 +0000 UTC m=+23.694983893" observedRunningTime="2026-04-17 15:17:31.245258138 +0000 UTC m=+24.636800173" watchObservedRunningTime="2026-04-17 15:17:31.246499013 +0000 UTC m=+24.638041049"
Apr 17 15:17:31.981187 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:31.981098 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm"
Apr 17 15:17:31.981891 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:31.981274 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:31.981891 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:31.981363 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret podName:ea6be3e1-87f0-4c68-b704-4a21dbd76850 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:47.981342991 +0000 UTC m=+41.372885006 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret") pod "global-pull-secret-syncer-z8tnm" (UID: "ea6be3e1-87f0-4c68-b704-4a21dbd76850") : object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:32.108766 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:32.108729 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:32.108766 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:32.108746 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm"
Apr 17 15:17:32.108934 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:32.108835 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c"
Apr 17 15:17:32.108934 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:32.108891 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850"
Apr 17 15:17:32.224963 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:32.224928 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" event={"ID":"7ef14720-2641-45c2-84ff-e658786a8152","Type":"ContainerStarted","Data":"8f703bb2900cfa104647995f9ef866b17bec9d86a97cf159faa9bc6a4f503833"}
Apr 17 15:17:32.241271 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:32.241200 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lvc6x" podStartSLOduration=3.323336996 podStartE2EDuration="25.241187554s" podCreationTimestamp="2026-04-17 15:17:07 +0000 UTC" firstStartedPulling="2026-04-17 15:17:09.727742265 +0000 UTC m=+3.119284277" lastFinishedPulling="2026-04-17 15:17:31.645592818 +0000 UTC m=+25.037134835" observedRunningTime="2026-04-17 15:17:32.2410265 +0000 UTC m=+25.632568535" watchObservedRunningTime="2026-04-17 15:17:32.241187554 +0000 UTC m=+25.632729588"
Apr 17 15:17:33.109206 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:33.109171 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:17:33.109660 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:33.109263 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08"
Apr 17 15:17:33.229831 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:33.229746 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" event={"ID":"c60e7d84-f238-44b7-91dd-6bebb34d4158","Type":"ContainerStarted","Data":"fcf5039453e9a87226aa47ce023cda227190f3c3275159e5595503539533bb11"}
Apr 17 15:17:33.231576 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:33.231550 2567 generic.go:358] "Generic (PLEG): container finished" podID="ffaaed56-110b-4fd5-9fbe-e8e71f6de33d" containerID="aca367f94ba5f3d1fea3d81715d1deac8e390102a667cdcb55ad4c4153daa5e8" exitCode=0
Apr 17 15:17:33.231676 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:33.231633 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhh97" event={"ID":"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d","Type":"ContainerDied","Data":"aca367f94ba5f3d1fea3d81715d1deac8e390102a667cdcb55ad4c4153daa5e8"}
Apr 17 15:17:34.108928 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:34.108897 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm"
Apr 17 15:17:34.109077 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:34.108902 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:17:34.109077 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:34.108994 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850"
Apr 17 15:17:34.109152 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:34.109088 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c"
Apr 17 15:17:35.109508 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:35.109134 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:17:35.109508 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:35.109389 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08"
Apr 17 15:17:35.236354 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:35.236204 2567 generic.go:358] "Generic (PLEG): container finished" podID="ffaaed56-110b-4fd5-9fbe-e8e71f6de33d" containerID="fff22092fe55e2d19786612a5fda25ccef7f5053a24306414d7288f82f9e317b" exitCode=0
Apr 17 15:17:35.236354 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:35.236294 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhh97" event={"ID":"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d","Type":"ContainerDied","Data":"fff22092fe55e2d19786612a5fda25ccef7f5053a24306414d7288f82f9e317b"}
Apr 17 15:17:35.239631 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:35.239609 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" event={"ID":"c60e7d84-f238-44b7-91dd-6bebb34d4158","Type":"ContainerStarted","Data":"3bcdd37499609c8ea1ef8c7b08fc60ba17b3e91ecfab8664d8290abe758fe560"}
Apr 17 15:17:35.239908 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:35.239891 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm"
Apr 17 15:17:35.239961 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:35.239916 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm"
Apr 17 15:17:35.239961 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:35.239928 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm"
Apr 17 15:17:35.254400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:35.254380 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm"
Apr 17 15:17:35.254507 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:35.254485 2567 kubelet.go:2658]
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" Apr 17 15:17:35.289502 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:35.289456 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm" podStartSLOduration=7.73419788 podStartE2EDuration="28.289443036s" podCreationTimestamp="2026-04-17 15:17:07 +0000 UTC" firstStartedPulling="2026-04-17 15:17:09.737929008 +0000 UTC m=+3.129471033" lastFinishedPulling="2026-04-17 15:17:30.293174167 +0000 UTC m=+23.684716189" observedRunningTime="2026-04-17 15:17:35.287702615 +0000 UTC m=+28.679244662" watchObservedRunningTime="2026-04-17 15:17:35.289443036 +0000 UTC m=+28.680985071" Apr 17 15:17:36.109045 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:36.109014 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:36.109212 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:36.109014 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:36.109212 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:36.109138 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c" Apr 17 15:17:36.109305 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:36.109228 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850" Apr 17 15:17:37.000564 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:37.000533 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n8fjz"] Apr 17 15:17:37.000999 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:37.000657 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:37.000999 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:37.000818 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c" Apr 17 15:17:37.002643 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:37.002617 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6wr6j"] Apr 17 15:17:37.002779 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:37.002754 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:37.002857 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:37.002828 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08" Apr 17 15:17:37.003383 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:37.003356 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z8tnm"] Apr 17 15:17:37.003501 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:37.003425 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:37.003501 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:37.003490 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850" Apr 17 15:17:39.109026 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:39.108995 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:39.109442 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:39.109035 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:39.109442 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:39.108996 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:39.109442 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:39.109139 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c" Apr 17 15:17:39.109442 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:39.109232 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08" Apr 17 15:17:39.109442 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:39.109342 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850" Apr 17 15:17:40.852330 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:40.852279 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:40.852775 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:40.852464 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:40.852775 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:40.852542 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs podName:48990b07-a036-41ef-a6cd-89d7520c417c nodeName:}" failed. 
No retries permitted until 2026-04-17 15:18:12.852522413 +0000 UTC m=+66.244064443 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs") pod "network-metrics-daemon-n8fjz" (UID: "48990b07-a036-41ef-a6cd-89d7520c417c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:40.952811 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:40.952770 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26mfb\" (UniqueName: \"kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb\") pod \"network-check-target-6wr6j\" (UID: \"d9daeb55-7347-4d29-a0ea-04ac78140a08\") " pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:40.952988 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:40.952969 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 15:17:40.953048 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:40.952990 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 15:17:40.953048 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:40.953000 2567 projected.go:194] Error preparing data for projected volume kube-api-access-26mfb for pod openshift-network-diagnostics/network-check-target-6wr6j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:40.953048 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:40.953046 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb 
podName:d9daeb55-7347-4d29-a0ea-04ac78140a08 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:12.953033206 +0000 UTC m=+66.344575219 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-26mfb" (UniqueName: "kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb") pod "network-check-target-6wr6j" (UID: "d9daeb55-7347-4d29-a0ea-04ac78140a08") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:41.108858 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:41.108770 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:41.109020 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:41.108771 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:41.109020 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:41.108907 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z8tnm" podUID="ea6be3e1-87f0-4c68-b704-4a21dbd76850" Apr 17 15:17:41.109020 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:41.108771 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:41.109020 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:41.108977 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wr6j" podUID="d9daeb55-7347-4d29-a0ea-04ac78140a08" Apr 17 15:17:41.109223 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:41.109066 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c" Apr 17 15:17:43.008586 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.008356 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-92.ec2.internal" event="NodeReady" Apr 17 15:17:43.008975 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.008713 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 15:17:43.042799 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.042769 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7c649d5796-5gdb8"] Apr 17 15:17:43.073512 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.073491 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c"] Apr 17 15:17:43.073682 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.073663 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.076430 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.076406 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 15:17:43.076543 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.076430 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 15:17:43.076543 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.076466 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5jck6\"" Apr 17 15:17:43.076636 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.076559 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 15:17:43.082793 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.082773 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 15:17:43.098071 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.098046 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c"] Apr 17 15:17:43.098071 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.098067 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c649d5796-5gdb8"] Apr 17 15:17:43.098071 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.098078 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jpn8m"] Apr 17 15:17:43.098275 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.098180 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:17:43.100750 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.100727 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hcrsv\"" Apr 17 15:17:43.100846 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.100757 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 15:17:43.100846 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.100729 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 15:17:43.131138 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.131110 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-78c86"] Apr 17 15:17:43.131258 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.131234 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:17:43.131325 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.131273 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:17:43.131524 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.131445 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:43.131524 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.131468 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j" Apr 17 15:17:43.133697 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.133678 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 15:17:43.136375 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.134624 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 15:17:43.136375 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.134723 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 15:17:43.136375 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.134784 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 15:17:43.136375 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.134809 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9m884\"" Apr 17 15:17:43.136375 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.135041 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 15:17:43.136375 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.135088 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 15:17:43.136375 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.135528 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 15:17:43.136375 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.135906 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sx2cz\"" Apr 17 15:17:43.136375 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.136129 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wwl79\"" Apr 17 15:17:43.149906 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.149883 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jpn8m"] Apr 17 15:17:43.149994 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.149913 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-78c86"] Apr 17 15:17:43.150057 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.150030 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.152461 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.152444 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p8skx\"" Apr 17 15:17:43.152546 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.152444 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 15:17:43.152705 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.152692 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 15:17:43.171141 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.171121 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-certificates\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.171237 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:17:43.171149 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8571a62a-9563-470f-aa7c-a31197ec34fd-trusted-ca\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.171237 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.171174 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8571a62a-9563-470f-aa7c-a31197ec34fd-image-registry-private-configuration\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.171303 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.171235 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.171303 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.171274 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8571a62a-9563-470f-aa7c-a31197ec34fd-ca-trust-extracted\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.171395 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.171348 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8571a62a-9563-470f-aa7c-a31197ec34fd-installation-pull-secrets\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.171429 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.171390 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn9zs\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-kube-api-access-qn9zs\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.171466 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.171452 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-bound-sa-token\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.256808 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.256781 2567 generic.go:358] "Generic (PLEG): container finished" podID="ffaaed56-110b-4fd5-9fbe-e8e71f6de33d" containerID="d33da7b97bed76a93ed417fe46f428a821c73c9cdb463b05017f07f9aa24b2c6" exitCode=0 Apr 17 15:17:43.256904 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.256850 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhh97" event={"ID":"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d","Type":"ContainerDied","Data":"d33da7b97bed76a93ed417fe46f428a821c73c9cdb463b05017f07f9aa24b2c6"} Apr 17 15:17:43.272562 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.272543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.272617 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.272592 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8571a62a-9563-470f-aa7c-a31197ec34fd-ca-trust-extracted\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.272663 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.272615 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8571a62a-9563-470f-aa7c-a31197ec34fd-installation-pull-secrets\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.272663 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.272634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qn9zs\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-kube-api-access-qn9zs\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.272663 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.272657 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:17:43.272814 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.272670 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 15:17:43.272814 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.272686 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c649d5796-5gdb8: secret "image-registry-tls" not found Apr 17 15:17:43.272814 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.272739 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls podName:8571a62a-9563-470f-aa7c-a31197ec34fd nodeName:}" failed. No retries permitted until 2026-04-17 15:17:43.772718683 +0000 UTC m=+37.164260703 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls") pod "image-registry-7c649d5796-5gdb8" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd") : secret "image-registry-tls" not found Apr 17 15:17:43.272814 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.272776 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.272814 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.272803 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b2ad6199-9c63-412c-b433-b95e9dec556b-tmp-dir\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " 
pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.273053 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.272833 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-bound-sa-token\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.273053 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.272875 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:17:43.273053 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.272976 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt648\" (UniqueName: \"kubernetes.io/projected/eeb34dd9-f023-4a23-8830-151d5b605625-kube-api-access-qt648\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:17:43.273053 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.272998 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8571a62a-9563-470f-aa7c-a31197ec34fd-ca-trust-extracted\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.273053 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.273040 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-certificates\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.273238 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.273071 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn444\" (UniqueName: \"kubernetes.io/projected/b2ad6199-9c63-412c-b433-b95e9dec556b-kube-api-access-kn444\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.273238 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.273098 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8571a62a-9563-470f-aa7c-a31197ec34fd-trusted-ca\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.273238 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.273118 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:17:43.273238 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.273136 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2ad6199-9c63-412c-b433-b95e9dec556b-config-volume\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.273238 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:17:43.273161 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8571a62a-9563-470f-aa7c-a31197ec34fd-image-registry-private-configuration\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.273637 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.273615 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-certificates\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.273849 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.273830 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8571a62a-9563-470f-aa7c-a31197ec34fd-trusted-ca\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.276816 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.276791 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8571a62a-9563-470f-aa7c-a31197ec34fd-image-registry-private-configuration\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.276948 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.276925 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8571a62a-9563-470f-aa7c-a31197ec34fd-installation-pull-secrets\") 
pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.282532 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.282486 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-bound-sa-token\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.282632 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.282615 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn9zs\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-kube-api-access-qn9zs\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.373919 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.373891 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:17:43.374035 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.373945 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.374035 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.373972 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b2ad6199-9c63-412c-b433-b95e9dec556b-tmp-dir\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.374035 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.374005 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:17:43.374190 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.374044 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt648\" (UniqueName: \"kubernetes.io/projected/eeb34dd9-f023-4a23-8830-151d5b605625-kube-api-access-qt648\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:17:43.374190 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.374080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kn444\" (UniqueName: \"kubernetes.io/projected/b2ad6199-9c63-412c-b433-b95e9dec556b-kube-api-access-kn444\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.374190 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.374104 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 15:17:43.374190 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.374176 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 15:17:43.374407 ip-10-0-130-92 
kubenswrapper[2567]: E0417 15:17:43.374183 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls podName:b2ad6199-9c63-412c-b433-b95e9dec556b nodeName:}" failed. No retries permitted until 2026-04-17 15:17:43.874159843 +0000 UTC m=+37.265701873 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls") pod "dns-default-78c86" (UID: "b2ad6199-9c63-412c-b433-b95e9dec556b") : secret "dns-default-metrics-tls" not found Apr 17 15:17:43.374407 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.374247 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:17:43.374407 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.374284 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2ad6199-9c63-412c-b433-b95e9dec556b-config-volume\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.374407 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.374326 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b2ad6199-9c63-412c-b433-b95e9dec556b-tmp-dir\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.374407 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.374363 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert 
podName:cf60d958-a515-4c65-8fd2-9bd9d19fa3ab nodeName:}" failed. No retries permitted until 2026-04-17 15:17:43.874343878 +0000 UTC m=+37.265885896 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lmx8c" (UID: "cf60d958-a515-4c65-8fd2-9bd9d19fa3ab") : secret "networking-console-plugin-cert" not found Apr 17 15:17:43.374679 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.374450 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 15:17:43.374679 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.374494 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert podName:eeb34dd9-f023-4a23-8830-151d5b605625 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:43.874483475 +0000 UTC m=+37.266025501 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert") pod "ingress-canary-jpn8m" (UID: "eeb34dd9-f023-4a23-8830-151d5b605625") : secret "canary-serving-cert" not found Apr 17 15:17:43.374679 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.374649 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:17:43.374985 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.374959 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2ad6199-9c63-412c-b433-b95e9dec556b-config-volume\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.381162 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.381137 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn444\" (UniqueName: \"kubernetes.io/projected/b2ad6199-9c63-412c-b433-b95e9dec556b-kube-api-access-kn444\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.381927 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.381902 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt648\" (UniqueName: \"kubernetes.io/projected/eeb34dd9-f023-4a23-8830-151d5b605625-kube-api-access-qt648\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:17:43.778077 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.778039 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:43.778250 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.778170 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 15:17:43.778250 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.778185 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c649d5796-5gdb8: secret "image-registry-tls" not found Apr 17 15:17:43.778250 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.778238 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls podName:8571a62a-9563-470f-aa7c-a31197ec34fd nodeName:}" failed. No retries permitted until 2026-04-17 15:17:44.77822321 +0000 UTC m=+38.169765224 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls") pod "image-registry-7c649d5796-5gdb8" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd") : secret "image-registry-tls" not found Apr 17 15:17:43.879006 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.878971 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:43.879169 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.879010 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:17:43.879169 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:43.879048 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:17:43.879169 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.879127 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 15:17:43.879169 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.879143 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 15:17:43.879390 ip-10-0-130-92 kubenswrapper[2567]: E0417 
15:17:43.879169 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 15:17:43.879390 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.879198 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls podName:b2ad6199-9c63-412c-b433-b95e9dec556b nodeName:}" failed. No retries permitted until 2026-04-17 15:17:44.87917798 +0000 UTC m=+38.270719994 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls") pod "dns-default-78c86" (UID: "b2ad6199-9c63-412c-b433-b95e9dec556b") : secret "dns-default-metrics-tls" not found Apr 17 15:17:43.879390 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.879216 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert podName:cf60d958-a515-4c65-8fd2-9bd9d19fa3ab nodeName:}" failed. No retries permitted until 2026-04-17 15:17:44.879207138 +0000 UTC m=+38.270749152 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lmx8c" (UID: "cf60d958-a515-4c65-8fd2-9bd9d19fa3ab") : secret "networking-console-plugin-cert" not found Apr 17 15:17:43.879390 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:43.879233 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert podName:eeb34dd9-f023-4a23-8830-151d5b605625 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:44.879223114 +0000 UTC m=+38.270765128 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert") pod "ingress-canary-jpn8m" (UID: "eeb34dd9-f023-4a23-8830-151d5b605625") : secret "canary-serving-cert" not found Apr 17 15:17:44.261298 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:44.261267 2567 generic.go:358] "Generic (PLEG): container finished" podID="ffaaed56-110b-4fd5-9fbe-e8e71f6de33d" containerID="43e08efebd455dd94dc014ad311713aa0e39d56cf3b7d2fd1a9ab40aaa3c9afd" exitCode=0 Apr 17 15:17:44.261916 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:44.261325 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhh97" event={"ID":"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d","Type":"ContainerDied","Data":"43e08efebd455dd94dc014ad311713aa0e39d56cf3b7d2fd1a9ab40aaa3c9afd"} Apr 17 15:17:44.786804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:44.786766 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:44.786971 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:44.786937 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 15:17:44.786971 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:44.786960 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c649d5796-5gdb8: secret "image-registry-tls" not found Apr 17 15:17:44.787050 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:44.787025 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls 
podName:8571a62a-9563-470f-aa7c-a31197ec34fd nodeName:}" failed. No retries permitted until 2026-04-17 15:17:46.787009588 +0000 UTC m=+40.178551607 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls") pod "image-registry-7c649d5796-5gdb8" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd") : secret "image-registry-tls" not found Apr 17 15:17:44.887628 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:44.887593 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:44.887945 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:44.887633 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:17:44.887945 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:44.887667 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:17:44.887945 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:44.887764 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 15:17:44.887945 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:44.887769 2567 secret.go:189] 
Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 15:17:44.887945 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:44.887813 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert podName:eeb34dd9-f023-4a23-8830-151d5b605625 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:46.887800762 +0000 UTC m=+40.279342779 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert") pod "ingress-canary-jpn8m" (UID: "eeb34dd9-f023-4a23-8830-151d5b605625") : secret "canary-serving-cert" not found Apr 17 15:17:44.887945 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:44.887764 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 15:17:44.887945 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:44.887826 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls podName:b2ad6199-9c63-412c-b433-b95e9dec556b nodeName:}" failed. No retries permitted until 2026-04-17 15:17:46.887820819 +0000 UTC m=+40.279362832 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls") pod "dns-default-78c86" (UID: "b2ad6199-9c63-412c-b433-b95e9dec556b") : secret "dns-default-metrics-tls" not found Apr 17 15:17:44.887945 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:44.887860 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert podName:cf60d958-a515-4c65-8fd2-9bd9d19fa3ab nodeName:}" failed. 
No retries permitted until 2026-04-17 15:17:46.88784181 +0000 UTC m=+40.279383823 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lmx8c" (UID: "cf60d958-a515-4c65-8fd2-9bd9d19fa3ab") : secret "networking-console-plugin-cert" not found Apr 17 15:17:45.265939 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:45.265899 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhh97" event={"ID":"ffaaed56-110b-4fd5-9fbe-e8e71f6de33d","Type":"ContainerStarted","Data":"dd03d6c04f47524321598a67a7f8966c356d334fbe34f0f09169623924f08abc"} Apr 17 15:17:45.291199 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:45.291156 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fhh97" podStartSLOduration=5.086412953 podStartE2EDuration="38.291143375s" podCreationTimestamp="2026-04-17 15:17:07 +0000 UTC" firstStartedPulling="2026-04-17 15:17:09.730788564 +0000 UTC m=+3.122330593" lastFinishedPulling="2026-04-17 15:17:42.935519002 +0000 UTC m=+36.327061015" observedRunningTime="2026-04-17 15:17:45.289584367 +0000 UTC m=+38.681126403" watchObservedRunningTime="2026-04-17 15:17:45.291143375 +0000 UTC m=+38.682685409" Apr 17 15:17:46.801003 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:46.800922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:17:46.801356 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:46.801069 2567 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 15:17:46.801356 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:46.801088 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c649d5796-5gdb8: secret "image-registry-tls" not found Apr 17 15:17:46.801356 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:46.801142 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls podName:8571a62a-9563-470f-aa7c-a31197ec34fd nodeName:}" failed. No retries permitted until 2026-04-17 15:17:50.801127978 +0000 UTC m=+44.192669991 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls") pod "image-registry-7c649d5796-5gdb8" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd") : secret "image-registry-tls" not found Apr 17 15:17:46.901772 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:46.901735 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:17:46.901772 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:46.901773 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:17:46.901971 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:46.901807 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:17:46.901971 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:46.901882 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 15:17:46.901971 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:46.901904 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 15:17:46.901971 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:46.901942 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls podName:b2ad6199-9c63-412c-b433-b95e9dec556b nodeName:}" failed. No retries permitted until 2026-04-17 15:17:50.90192725 +0000 UTC m=+44.293469267 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls") pod "dns-default-78c86" (UID: "b2ad6199-9c63-412c-b433-b95e9dec556b") : secret "dns-default-metrics-tls" not found Apr 17 15:17:46.901971 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:46.901957 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert podName:cf60d958-a515-4c65-8fd2-9bd9d19fa3ab nodeName:}" failed. No retries permitted until 2026-04-17 15:17:50.90195088 +0000 UTC m=+44.293492892 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lmx8c" (UID: "cf60d958-a515-4c65-8fd2-9bd9d19fa3ab") : secret "networking-console-plugin-cert" not found Apr 17 15:17:46.902145 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:46.901904 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 15:17:46.902145 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:46.902031 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert podName:eeb34dd9-f023-4a23-8830-151d5b605625 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:50.902015643 +0000 UTC m=+44.293557666 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert") pod "ingress-canary-jpn8m" (UID: "eeb34dd9-f023-4a23-8830-151d5b605625") : secret "canary-serving-cert" not found Apr 17 15:17:48.008483 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:48.008444 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm" Apr 17 15:17:48.010651 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:48.010632 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ea6be3e1-87f0-4c68-b704-4a21dbd76850-original-pull-secret\") pod \"global-pull-secret-syncer-z8tnm\" (UID: \"ea6be3e1-87f0-4c68-b704-4a21dbd76850\") " pod="kube-system/global-pull-secret-syncer-z8tnm" 
Apr 17 15:17:48.250341 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:48.250284 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z8tnm"
Apr 17 15:17:48.418731 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:48.418565 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z8tnm"]
Apr 17 15:17:48.422498 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:17:48.422471 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6be3e1_87f0_4c68_b704_4a21dbd76850.slice/crio-16494577545c567bc6d6b85c4098937ee3899e8a586c99a41027f675d7c1d5f2 WatchSource:0}: Error finding container 16494577545c567bc6d6b85c4098937ee3899e8a586c99a41027f675d7c1d5f2: Status 404 returned error can't find the container with id 16494577545c567bc6d6b85c4098937ee3899e8a586c99a41027f675d7c1d5f2
Apr 17 15:17:49.274946 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:49.274908 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z8tnm" event={"ID":"ea6be3e1-87f0-4c68-b704-4a21dbd76850","Type":"ContainerStarted","Data":"16494577545c567bc6d6b85c4098937ee3899e8a586c99a41027f675d7c1d5f2"}
Apr 17 15:17:50.831518 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:50.831482 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8"
Apr 17 15:17:50.831879 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:50.831631 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 15:17:50.831879 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:50.831650 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c649d5796-5gdb8: secret "image-registry-tls" not found
Apr 17 15:17:50.831879 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:50.831706 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls podName:8571a62a-9563-470f-aa7c-a31197ec34fd nodeName:}" failed. No retries permitted until 2026-04-17 15:17:58.831688811 +0000 UTC m=+52.223230827 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls") pod "image-registry-7c649d5796-5gdb8" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd") : secret "image-registry-tls" not found
Apr 17 15:17:50.932603 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:50.932570 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86"
Apr 17 15:17:50.932772 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:50.932615 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c"
Apr 17 15:17:50.932772 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:50.932651 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m"
Apr 17 15:17:50.932772 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:50.932739 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 15:17:50.932906 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:50.932792 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 15:17:50.932906 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:50.932739 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 15:17:50.932906 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:50.932799 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert podName:eeb34dd9-f023-4a23-8830-151d5b605625 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:58.932784885 +0000 UTC m=+52.324326897 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert") pod "ingress-canary-jpn8m" (UID: "eeb34dd9-f023-4a23-8830-151d5b605625") : secret "canary-serving-cert" not found
Apr 17 15:17:50.932906 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:50.932868 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert podName:cf60d958-a515-4c65-8fd2-9bd9d19fa3ab nodeName:}" failed. No retries permitted until 2026-04-17 15:17:58.932849968 +0000 UTC m=+52.324391989 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lmx8c" (UID: "cf60d958-a515-4c65-8fd2-9bd9d19fa3ab") : secret "networking-console-plugin-cert" not found
Apr 17 15:17:50.932906 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:50.932886 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls podName:b2ad6199-9c63-412c-b433-b95e9dec556b nodeName:}" failed. No retries permitted until 2026-04-17 15:17:58.932875502 +0000 UTC m=+52.324417521 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls") pod "dns-default-78c86" (UID: "b2ad6199-9c63-412c-b433-b95e9dec556b") : secret "dns-default-metrics-tls" not found
Apr 17 15:17:54.285823 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:54.285730 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z8tnm" event={"ID":"ea6be3e1-87f0-4c68-b704-4a21dbd76850","Type":"ContainerStarted","Data":"6a001f33b0df48e77c5a412c8290bd9265d6efcaec2d5aa3bd6fe0b613f7dbdd"}
Apr 17 15:17:54.299907 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:54.299866 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-z8tnm" podStartSLOduration=32.829843555 podStartE2EDuration="38.299852208s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="2026-04-17 15:17:48.424522907 +0000 UTC m=+41.816064920" lastFinishedPulling="2026-04-17 15:17:53.894531556 +0000 UTC m=+47.286073573" observedRunningTime="2026-04-17 15:17:54.29890989 +0000 UTC m=+47.690451934" watchObservedRunningTime="2026-04-17 15:17:54.299852208 +0000 UTC m=+47.691394240"
Apr 17 15:17:58.887111 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:58.887067 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8"
Apr 17 15:17:58.887643 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:58.887238 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 15:17:58.887643 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:58.887259 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c649d5796-5gdb8: secret "image-registry-tls" not found
Apr 17 15:17:58.887643 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:58.887336 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls podName:8571a62a-9563-470f-aa7c-a31197ec34fd nodeName:}" failed. No retries permitted until 2026-04-17 15:18:14.887299263 +0000 UTC m=+68.278841275 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls") pod "image-registry-7c649d5796-5gdb8" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd") : secret "image-registry-tls" not found
Apr 17 15:17:58.987893 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:58.987858 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86"
Apr 17 15:17:58.988002 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:58.987898 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c"
Apr 17 15:17:58.988002 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:17:58.987924 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m"
Apr 17 15:17:58.988065 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:58.988023 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 15:17:58.988065 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:58.988037 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 15:17:58.988127 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:58.988023 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 15:17:58.988127 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:58.988084 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert podName:eeb34dd9-f023-4a23-8830-151d5b605625 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:14.988070694 +0000 UTC m=+68.379612707 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert") pod "ingress-canary-jpn8m" (UID: "eeb34dd9-f023-4a23-8830-151d5b605625") : secret "canary-serving-cert" not found
Apr 17 15:17:58.988127 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:58.988101 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls podName:b2ad6199-9c63-412c-b433-b95e9dec556b nodeName:}" failed. No retries permitted until 2026-04-17 15:18:14.988088652 +0000 UTC m=+68.379630665 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls") pod "dns-default-78c86" (UID: "b2ad6199-9c63-412c-b433-b95e9dec556b") : secret "dns-default-metrics-tls" not found
Apr 17 15:17:58.988127 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:17:58.988115 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert podName:cf60d958-a515-4c65-8fd2-9bd9d19fa3ab nodeName:}" failed. No retries permitted until 2026-04-17 15:18:14.988107468 +0000 UTC m=+68.379649480 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lmx8c" (UID: "cf60d958-a515-4c65-8fd2-9bd9d19fa3ab") : secret "networking-console-plugin-cert" not found
Apr 17 15:18:07.282891 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:07.282862 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rphqm"
Apr 17 15:18:12.887542 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:12.887506 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:18:12.890083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:12.890067 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 15:18:12.898090 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:12.898071 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 15:18:12.898143 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:12.898125 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs podName:48990b07-a036-41ef-a6cd-89d7520c417c nodeName:}" failed. No retries permitted until 2026-04-17 15:19:16.89811049 +0000 UTC m=+130.289652503 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs") pod "network-metrics-daemon-n8fjz" (UID: "48990b07-a036-41ef-a6cd-89d7520c417c") : secret "metrics-daemon-secret" not found
Apr 17 15:18:12.988016 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:12.987989 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26mfb\" (UniqueName: \"kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb\") pod \"network-check-target-6wr6j\" (UID: \"d9daeb55-7347-4d29-a0ea-04ac78140a08\") " pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:18:12.991054 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:12.991036 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 15:18:13.001468 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:13.001449 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 15:18:13.012834 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:13.012812 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26mfb\" (UniqueName: \"kubernetes.io/projected/d9daeb55-7347-4d29-a0ea-04ac78140a08-kube-api-access-26mfb\") pod \"network-check-target-6wr6j\" (UID: \"d9daeb55-7347-4d29-a0ea-04ac78140a08\") " pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:18:13.165647 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:13.165559 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sx2cz\""
Apr 17 15:18:13.173528 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:13.173507 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:18:13.297989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:13.297961 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6wr6j"]
Apr 17 15:18:13.300750 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:18:13.300719 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9daeb55_7347_4d29_a0ea_04ac78140a08.slice/crio-ab24e33c38e28b2c94047ad3ea7e18c914518e826a16a8110e8447404b8f399a WatchSource:0}: Error finding container ab24e33c38e28b2c94047ad3ea7e18c914518e826a16a8110e8447404b8f399a: Status 404 returned error can't find the container with id ab24e33c38e28b2c94047ad3ea7e18c914518e826a16a8110e8447404b8f399a
Apr 17 15:18:13.321442 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:13.321413 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6wr6j" event={"ID":"d9daeb55-7347-4d29-a0ea-04ac78140a08","Type":"ContainerStarted","Data":"ab24e33c38e28b2c94047ad3ea7e18c914518e826a16a8110e8447404b8f399a"}
Apr 17 15:18:14.903138 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:14.903103 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8"
Apr 17 15:18:14.903621 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:14.903276 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 15:18:14.903621 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:14.903301 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c649d5796-5gdb8: secret "image-registry-tls" not found
Apr 17 15:18:14.903621 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:14.903399 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls podName:8571a62a-9563-470f-aa7c-a31197ec34fd nodeName:}" failed. No retries permitted until 2026-04-17 15:18:46.903376716 +0000 UTC m=+100.294918747 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls") pod "image-registry-7c649d5796-5gdb8" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd") : secret "image-registry-tls" not found
Apr 17 15:18:15.004482 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:15.004444 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m"
Apr 17 15:18:15.004658 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:15.004560 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86"
Apr 17 15:18:15.004658 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:15.004597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c"
Apr 17 15:18:15.004658 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:15.004615 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 15:18:15.004797 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:15.004660 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 15:18:15.004797 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:15.004686 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert podName:eeb34dd9-f023-4a23-8830-151d5b605625 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:47.004666527 +0000 UTC m=+100.396208568 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert") pod "ingress-canary-jpn8m" (UID: "eeb34dd9-f023-4a23-8830-151d5b605625") : secret "canary-serving-cert" not found
Apr 17 15:18:15.004797 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:15.004704 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls podName:b2ad6199-9c63-412c-b433-b95e9dec556b nodeName:}" failed. No retries permitted until 2026-04-17 15:18:47.004696551 +0000 UTC m=+100.396238578 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls") pod "dns-default-78c86" (UID: "b2ad6199-9c63-412c-b433-b95e9dec556b") : secret "dns-default-metrics-tls" not found
Apr 17 15:18:15.004797 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:15.004712 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 15:18:15.004797 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:15.004765 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert podName:cf60d958-a515-4c65-8fd2-9bd9d19fa3ab nodeName:}" failed. No retries permitted until 2026-04-17 15:18:47.004748822 +0000 UTC m=+100.396290852 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lmx8c" (UID: "cf60d958-a515-4c65-8fd2-9bd9d19fa3ab") : secret "networking-console-plugin-cert" not found
Apr 17 15:18:16.329771 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:16.329689 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6wr6j" event={"ID":"d9daeb55-7347-4d29-a0ea-04ac78140a08","Type":"ContainerStarted","Data":"7548ef14c3b597267dbf1682e5b9824f4e8a7ed37407ec8ddcce9b524170ab46"}
Apr 17 15:18:16.330107 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:16.329800 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:18:16.344936 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:16.344894 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6wr6j" podStartSLOduration=66.623786724 podStartE2EDuration="1m9.344876583s" podCreationTimestamp="2026-04-17 15:17:07 +0000 UTC" firstStartedPulling="2026-04-17 15:18:13.302589076 +0000 UTC m=+66.694131090" lastFinishedPulling="2026-04-17 15:18:16.023678933 +0000 UTC m=+69.415220949" observedRunningTime="2026-04-17 15:18:16.344029311 +0000 UTC m=+69.735571346" watchObservedRunningTime="2026-04-17 15:18:16.344876583 +0000 UTC m=+69.736418615"
Apr 17 15:18:46.938135 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:46.938096 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8"
Apr 17 15:18:46.938546 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:46.938247 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 15:18:46.938546 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:46.938269 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c649d5796-5gdb8: secret "image-registry-tls" not found
Apr 17 15:18:46.938546 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:46.938372 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls podName:8571a62a-9563-470f-aa7c-a31197ec34fd nodeName:}" failed. No retries permitted until 2026-04-17 15:19:50.938353557 +0000 UTC m=+164.329895591 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls") pod "image-registry-7c649d5796-5gdb8" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd") : secret "image-registry-tls" not found
Apr 17 15:18:47.039257 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:47.039215 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86"
Apr 17 15:18:47.039257 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:47.039265 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c"
Apr 17 15:18:47.039430 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:47.039296 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m"
Apr 17 15:18:47.039430 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:47.039390 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 15:18:47.039430 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:47.039425 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 15:18:47.039521 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:47.039389 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 15:18:47.039521 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:47.039470 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert podName:cf60d958-a515-4c65-8fd2-9bd9d19fa3ab nodeName:}" failed. No retries permitted until 2026-04-17 15:19:51.039451614 +0000 UTC m=+164.430993638 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lmx8c" (UID: "cf60d958-a515-4c65-8fd2-9bd9d19fa3ab") : secret "networking-console-plugin-cert" not found
Apr 17 15:18:47.039521 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:47.039489 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls podName:b2ad6199-9c63-412c-b433-b95e9dec556b nodeName:}" failed. No retries permitted until 2026-04-17 15:19:51.039477349 +0000 UTC m=+164.431019363 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls") pod "dns-default-78c86" (UID: "b2ad6199-9c63-412c-b433-b95e9dec556b") : secret "dns-default-metrics-tls" not found
Apr 17 15:18:47.039521 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:18:47.039504 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert podName:eeb34dd9-f023-4a23-8830-151d5b605625 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:51.039496407 +0000 UTC m=+164.431038422 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert") pod "ingress-canary-jpn8m" (UID: "eeb34dd9-f023-4a23-8830-151d5b605625") : secret "canary-serving-cert" not found
Apr 17 15:18:47.334491 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:18:47.334417 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6wr6j"
Apr 17 15:19:07.073091 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.073058 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7"]
Apr 17 15:19:07.075926 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.075910 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7"
Apr 17 15:19:07.079831 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.079808 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 15:19:07.081016 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.080996 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 17 15:19:07.081145 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.081040 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 15:19:07.081635 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.081341 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 17 15:19:07.081635 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.081362 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2swmc\""
Apr 17 15:19:07.084634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.084615 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7"]
Apr 17 15:19:07.090447 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.090426 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8zj5\" (UniqueName: \"kubernetes.io/projected/4261f15f-644e-4914-8e45-1bfa8a2447d7-kube-api-access-x8zj5\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7"
Apr 17 15:19:07.090557 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.090472 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4261f15f-644e-4914-8e45-1bfa8a2447d7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7"
Apr 17 15:19:07.090557 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.090490 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7"
Apr 17 15:19:07.171731 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.171698 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-wvmmd"]
Apr 17 15:19:07.174608 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.174586 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-wvmmd"
Apr 17 15:19:07.174992 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.174968 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-58ccc558bb-xngk4"]
Apr 17 15:19:07.177156 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.177132 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 15:19:07.177451 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.177426 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-tlj9q\""
Apr 17 15:19:07.177660 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.177635 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 15:19:07.177779 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.177760 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 15:19:07.177912 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.177894 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 15:19:07.179897 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.179615 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-58ccc558bb-xngk4"
Apr 17 15:19:07.181986 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.181963 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 15:19:07.182105 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.181987 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 15:19:07.182599 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.182578 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hjh72\""
Apr 17 15:19:07.182826 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.182808 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 15:19:07.182992 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.182952 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 15:19:07.183239 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.183216 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-wvmmd"]
Apr 17 15:19:07.183239 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.182966 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 15:19:07.183516 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.182953 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 15:19:07.183932 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.183900 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 15:19:07.188375 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.188354 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-58ccc558bb-xngk4"]
Apr 17 15:19:07.191394 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191371 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4"
Apr 17 15:19:07.191489 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191424 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4270e3fe-b069-4a89-bd6d-10514be6fb65-snapshots\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd"
Apr 17 15:19:07.191535 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191479 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmb65\" (UniqueName: \"kubernetes.io/projected/4270e3fe-b069-4a89-bd6d-10514be6fb65-kube-api-access-jmb65\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd"
Apr 17 15:19:07.191535 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191516 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-stats-auth\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4"
Apr 17 15:19:07.191613 ip-10-0-130-92 kubenswrapper[2567]: I0417
15:19:07.191567 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8zj5\" (UniqueName: \"kubernetes.io/projected/4261f15f-644e-4914-8e45-1bfa8a2447d7-kube-api-access-x8zj5\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" Apr 17 15:19:07.191613 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191601 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-default-certificate\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.191699 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191653 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4270e3fe-b069-4a89-bd6d-10514be6fb65-serving-cert\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.191849 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191774 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4261f15f-644e-4914-8e45-1bfa8a2447d7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" Apr 17 15:19:07.191849 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191799 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" Apr 17 15:19:07.191849 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191826 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4270e3fe-b069-4a89-bd6d-10514be6fb65-tmp\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.192014 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191853 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.192014 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191875 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4270e3fe-b069-4a89-bd6d-10514be6fb65-service-ca-bundle\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.192014 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191908 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp8xp\" (UniqueName: \"kubernetes.io/projected/080f2dec-182b-40e8-adf6-95cf8c5342c7-kube-api-access-hp8xp\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " 
pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.192014 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.191938 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4270e3fe-b069-4a89-bd6d-10514be6fb65-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.192014 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:07.191947 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 15:19:07.192014 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:07.192013 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls podName:4261f15f-644e-4914-8e45-1bfa8a2447d7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:07.691995456 +0000 UTC m=+121.083537476 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n8pd7" (UID: "4261f15f-644e-4914-8e45-1bfa8a2447d7") : secret "cluster-monitoring-operator-tls" not found Apr 17 15:19:07.192580 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.192561 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4261f15f-644e-4914-8e45-1bfa8a2447d7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" Apr 17 15:19:07.201700 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.201677 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8zj5\" (UniqueName: \"kubernetes.io/projected/4261f15f-644e-4914-8e45-1bfa8a2447d7-kube-api-access-x8zj5\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" Apr 17 15:19:07.292589 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.292557 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.292757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.292599 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4270e3fe-b069-4a89-bd6d-10514be6fb65-snapshots\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: 
\"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.292757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.292616 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmb65\" (UniqueName: \"kubernetes.io/projected/4270e3fe-b069-4a89-bd6d-10514be6fb65-kube-api-access-jmb65\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.292757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.292634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-stats-auth\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.292757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.292659 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-default-certificate\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.292757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.292687 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4270e3fe-b069-4a89-bd6d-10514be6fb65-serving-cert\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.292757 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:07.292725 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle podName:080f2dec-182b-40e8-adf6-95cf8c5342c7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:07.792701581 +0000 UTC m=+121.184243619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle") pod "router-default-58ccc558bb-xngk4" (UID: "080f2dec-182b-40e8-adf6-95cf8c5342c7") : configmap references non-existent config key: service-ca.crt Apr 17 15:19:07.293083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.292775 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4270e3fe-b069-4a89-bd6d-10514be6fb65-tmp\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.293083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.292806 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.293083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.292832 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4270e3fe-b069-4a89-bd6d-10514be6fb65-service-ca-bundle\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.293083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.292866 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hp8xp\" (UniqueName: \"kubernetes.io/projected/080f2dec-182b-40e8-adf6-95cf8c5342c7-kube-api-access-hp8xp\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.293083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.292895 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4270e3fe-b069-4a89-bd6d-10514be6fb65-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.293340 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.293239 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4270e3fe-b069-4a89-bd6d-10514be6fb65-tmp\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.293404 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:07.293354 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 15:19:07.293454 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:07.293419 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs podName:080f2dec-182b-40e8-adf6-95cf8c5342c7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:07.793401241 +0000 UTC m=+121.184943259 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs") pod "router-default-58ccc558bb-xngk4" (UID: "080f2dec-182b-40e8-adf6-95cf8c5342c7") : secret "router-metrics-certs-default" not found Apr 17 15:19:07.293454 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.293434 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4270e3fe-b069-4a89-bd6d-10514be6fb65-snapshots\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.293730 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.293701 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4270e3fe-b069-4a89-bd6d-10514be6fb65-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.294134 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.294115 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4270e3fe-b069-4a89-bd6d-10514be6fb65-service-ca-bundle\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.295084 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.295057 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-stats-auth\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.295306 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:19:07.295286 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4270e3fe-b069-4a89-bd6d-10514be6fb65-serving-cert\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.295518 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.295498 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-default-certificate\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.300483 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.300465 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmb65\" (UniqueName: \"kubernetes.io/projected/4270e3fe-b069-4a89-bd6d-10514be6fb65-kube-api-access-jmb65\") pod \"insights-operator-585dfdc468-wvmmd\" (UID: \"4270e3fe-b069-4a89-bd6d-10514be6fb65\") " pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.300754 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.300731 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp8xp\" (UniqueName: \"kubernetes.io/projected/080f2dec-182b-40e8-adf6-95cf8c5342c7-kube-api-access-hp8xp\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.488020 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.487982 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-wvmmd" Apr 17 15:19:07.597491 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.597462 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-wvmmd"] Apr 17 15:19:07.601799 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:07.601772 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4270e3fe_b069_4a89_bd6d_10514be6fb65.slice/crio-c2b881b50d8ddd70853735242953dd42813075f79d997cbe333d72a99d4f1749 WatchSource:0}: Error finding container c2b881b50d8ddd70853735242953dd42813075f79d997cbe333d72a99d4f1749: Status 404 returned error can't find the container with id c2b881b50d8ddd70853735242953dd42813075f79d997cbe333d72a99d4f1749 Apr 17 15:19:07.696643 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.696611 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" Apr 17 15:19:07.696784 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:07.696769 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 15:19:07.696839 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:07.696830 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls podName:4261f15f-644e-4914-8e45-1bfa8a2447d7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:08.69681292 +0000 UTC m=+122.088354941 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n8pd7" (UID: "4261f15f-644e-4914-8e45-1bfa8a2447d7") : secret "cluster-monitoring-operator-tls" not found Apr 17 15:19:07.797489 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.797426 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.797581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:07.797489 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:07.797581 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:07.797563 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 15:19:07.797652 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:07.797623 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs podName:080f2dec-182b-40e8-adf6-95cf8c5342c7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:08.797607866 +0000 UTC m=+122.189149880 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs") pod "router-default-58ccc558bb-xngk4" (UID: "080f2dec-182b-40e8-adf6-95cf8c5342c7") : secret "router-metrics-certs-default" not found Apr 17 15:19:07.797652 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:07.797638 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle podName:080f2dec-182b-40e8-adf6-95cf8c5342c7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:08.797631586 +0000 UTC m=+122.189173599 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle") pod "router-default-58ccc558bb-xngk4" (UID: "080f2dec-182b-40e8-adf6-95cf8c5342c7") : configmap references non-existent config key: service-ca.crt Apr 17 15:19:08.430865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:08.430833 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wvmmd" event={"ID":"4270e3fe-b069-4a89-bd6d-10514be6fb65","Type":"ContainerStarted","Data":"c2b881b50d8ddd70853735242953dd42813075f79d997cbe333d72a99d4f1749"} Apr 17 15:19:08.703906 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:08.703819 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" Apr 17 15:19:08.704043 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:08.703970 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret 
"cluster-monitoring-operator-tls" not found Apr 17 15:19:08.704043 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:08.704035 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls podName:4261f15f-644e-4914-8e45-1bfa8a2447d7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:10.704020056 +0000 UTC m=+124.095562070 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n8pd7" (UID: "4261f15f-644e-4914-8e45-1bfa8a2447d7") : secret "cluster-monitoring-operator-tls" not found Apr 17 15:19:08.804637 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:08.804606 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:08.804817 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:08.804707 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:08.804817 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:08.804794 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle podName:080f2dec-182b-40e8-adf6-95cf8c5342c7 nodeName:}" failed. 
No retries permitted until 2026-04-17 15:19:10.804771984 +0000 UTC m=+124.196314013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle") pod "router-default-58ccc558bb-xngk4" (UID: "080f2dec-182b-40e8-adf6-95cf8c5342c7") : configmap references non-existent config key: service-ca.crt Apr 17 15:19:08.804941 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:08.804855 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 15:19:08.804941 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:08.804915 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs podName:080f2dec-182b-40e8-adf6-95cf8c5342c7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:10.804896161 +0000 UTC m=+124.196438194 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs") pod "router-default-58ccc558bb-xngk4" (UID: "080f2dec-182b-40e8-adf6-95cf8c5342c7") : secret "router-metrics-certs-default" not found
Apr 17 15:19:10.436078 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:10.436043 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wvmmd" event={"ID":"4270e3fe-b069-4a89-bd6d-10514be6fb65","Type":"ContainerStarted","Data":"e6c4d07455cb865fab8bd60310eae9d48aee9d122bed51bd843b5a84a2a7074a"}
Apr 17 15:19:10.453996 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:10.453944 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-wvmmd" podStartSLOduration=0.726174553 podStartE2EDuration="3.453927572s" podCreationTimestamp="2026-04-17 15:19:07 +0000 UTC" firstStartedPulling="2026-04-17 15:19:07.603547534 +0000 UTC m=+120.995089547" lastFinishedPulling="2026-04-17 15:19:10.331300542 +0000 UTC m=+123.722842566" observedRunningTime="2026-04-17 15:19:10.452240905 +0000 UTC m=+123.843782952" watchObservedRunningTime="2026-04-17 15:19:10.453927572 +0000 UTC m=+123.845469608"
Apr 17 15:19:10.722924 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:10.722884 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7"
Apr 17 15:19:10.723085 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:10.723016 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 15:19:10.723085 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:10.723081 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls podName:4261f15f-644e-4914-8e45-1bfa8a2447d7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:14.723065857 +0000 UTC m=+128.114607871 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n8pd7" (UID: "4261f15f-644e-4914-8e45-1bfa8a2447d7") : secret "cluster-monitoring-operator-tls" not found
Apr 17 15:19:10.823837 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:10.823797 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4"
Apr 17 15:19:10.824018 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:10.823863 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4"
Apr 17 15:19:10.824018 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:10.823945 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 15:19:10.824018 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:10.823987 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle podName:080f2dec-182b-40e8-adf6-95cf8c5342c7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:14.823973764 +0000 UTC m=+128.215515782 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle") pod "router-default-58ccc558bb-xngk4" (UID: "080f2dec-182b-40e8-adf6-95cf8c5342c7") : configmap references non-existent config key: service-ca.crt
Apr 17 15:19:10.824018 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:10.824001 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs podName:080f2dec-182b-40e8-adf6-95cf8c5342c7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:14.823995485 +0000 UTC m=+128.215537498 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs") pod "router-default-58ccc558bb-xngk4" (UID: "080f2dec-182b-40e8-adf6-95cf8c5342c7") : secret "router-metrics-certs-default" not found
Apr 17 15:19:14.753711 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:14.753679 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7"
Apr 17 15:19:14.754155 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:14.753830 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 15:19:14.754155 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:14.753894 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls podName:4261f15f-644e-4914-8e45-1bfa8a2447d7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:22.753878861 +0000 UTC m=+136.145420874 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n8pd7" (UID: "4261f15f-644e-4914-8e45-1bfa8a2447d7") : secret "cluster-monitoring-operator-tls" not found
Apr 17 15:19:14.767465 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:14.767444 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r2ffn_a2a09a76-6aeb-4520-9efb-287cddc7f75b/dns-node-resolver/0.log"
Apr 17 15:19:14.854957 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:14.854932 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4"
Apr 17 15:19:14.855042 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:14.854990 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4"
Apr 17 15:19:14.855083 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:14.855069 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 15:19:14.855131 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:14.855122 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs podName:080f2dec-182b-40e8-adf6-95cf8c5342c7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:22.855108475 +0000 UTC m=+136.246650488 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs") pod "router-default-58ccc558bb-xngk4" (UID: "080f2dec-182b-40e8-adf6-95cf8c5342c7") : secret "router-metrics-certs-default" not found
Apr 17 15:19:14.855173 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:14.855138 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle podName:080f2dec-182b-40e8-adf6-95cf8c5342c7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:22.855131859 +0000 UTC m=+136.246673871 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle") pod "router-default-58ccc558bb-xngk4" (UID: "080f2dec-182b-40e8-adf6-95cf8c5342c7") : configmap references non-existent config key: service-ca.crt
Apr 17 15:19:15.367652 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:15.367624 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4n2f7_1f42d472-7136-4d33-b081-4e8ae758480e/node-ca/0.log"
Apr 17 15:19:16.973115 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:16.973083 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz"
Apr 17 15:19:16.973526 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:16.973240 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 15:19:16.973526 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:16.973303 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs podName:48990b07-a036-41ef-a6cd-89d7520c417c nodeName:}" failed. No retries permitted until 2026-04-17 15:21:18.973285243 +0000 UTC m=+252.364827261 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs") pod "network-metrics-daemon-n8fjz" (UID: "48990b07-a036-41ef-a6cd-89d7520c417c") : secret "metrics-daemon-secret" not found
Apr 17 15:19:17.055645 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.055616 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kx2v7"]
Apr 17 15:19:17.058677 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.058659 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kx2v7"
Apr 17 15:19:17.061215 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.061193 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 17 15:19:17.061348 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.061193 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 17 15:19:17.061348 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.061203 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-hzmm2\""
Apr 17 15:19:17.065447 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.065413 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kx2v7"]
Apr 17 15:19:17.168902 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.168870 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"]
Apr 17 15:19:17.171721 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.171704 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"
Apr 17 15:19:17.175487 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.175465 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sljps\" (UniqueName: \"kubernetes.io/projected/4dc26639-dd19-46cc-97ba-c69e0b27c74e-kube-api-access-sljps\") pod \"volume-data-source-validator-7c6cbb6c87-kx2v7\" (UID: \"4dc26639-dd19-46cc-97ba-c69e0b27c74e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kx2v7"
Apr 17 15:19:17.175820 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.175806 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 15:19:17.175999 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.175982 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-5mcf8\""
Apr 17 15:19:17.176108 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.176089 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 15:19:17.176167 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.176154 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 15:19:17.176222 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.176170 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 15:19:17.182365 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.182334 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"]
Apr 17 15:19:17.276475 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.276400 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d7lx\" (UniqueName: \"kubernetes.io/projected/9080ca44-1027-491b-9bf1-12443cd3b452-kube-api-access-2d7lx\") pod \"kube-storage-version-migrator-operator-6769c5d45-hchf4\" (UID: \"9080ca44-1027-491b-9bf1-12443cd3b452\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"
Apr 17 15:19:17.276475 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.276434 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9080ca44-1027-491b-9bf1-12443cd3b452-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-hchf4\" (UID: \"9080ca44-1027-491b-9bf1-12443cd3b452\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"
Apr 17 15:19:17.276655 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.276495 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sljps\" (UniqueName: \"kubernetes.io/projected/4dc26639-dd19-46cc-97ba-c69e0b27c74e-kube-api-access-sljps\") pod \"volume-data-source-validator-7c6cbb6c87-kx2v7\" (UID: \"4dc26639-dd19-46cc-97ba-c69e0b27c74e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kx2v7"
Apr 17 15:19:17.276655 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.276538 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9080ca44-1027-491b-9bf1-12443cd3b452-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-hchf4\" (UID: \"9080ca44-1027-491b-9bf1-12443cd3b452\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"
Apr 17 15:19:17.284940 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.284916 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sljps\" (UniqueName: \"kubernetes.io/projected/4dc26639-dd19-46cc-97ba-c69e0b27c74e-kube-api-access-sljps\") pod \"volume-data-source-validator-7c6cbb6c87-kx2v7\" (UID: \"4dc26639-dd19-46cc-97ba-c69e0b27c74e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kx2v7"
Apr 17 15:19:17.367877 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.367847 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kx2v7"
Apr 17 15:19:17.377759 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.377731 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2d7lx\" (UniqueName: \"kubernetes.io/projected/9080ca44-1027-491b-9bf1-12443cd3b452-kube-api-access-2d7lx\") pod \"kube-storage-version-migrator-operator-6769c5d45-hchf4\" (UID: \"9080ca44-1027-491b-9bf1-12443cd3b452\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"
Apr 17 15:19:17.377893 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.377776 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9080ca44-1027-491b-9bf1-12443cd3b452-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-hchf4\" (UID: \"9080ca44-1027-491b-9bf1-12443cd3b452\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"
Apr 17 15:19:17.377960 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.377897 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9080ca44-1027-491b-9bf1-12443cd3b452-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-hchf4\" (UID: \"9080ca44-1027-491b-9bf1-12443cd3b452\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"
Apr 17 15:19:17.378360 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.378336 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9080ca44-1027-491b-9bf1-12443cd3b452-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-hchf4\" (UID: \"9080ca44-1027-491b-9bf1-12443cd3b452\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"
Apr 17 15:19:17.380204 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.380182 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9080ca44-1027-491b-9bf1-12443cd3b452-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-hchf4\" (UID: \"9080ca44-1027-491b-9bf1-12443cd3b452\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"
Apr 17 15:19:17.385037 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.385016 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d7lx\" (UniqueName: \"kubernetes.io/projected/9080ca44-1027-491b-9bf1-12443cd3b452-kube-api-access-2d7lx\") pod \"kube-storage-version-migrator-operator-6769c5d45-hchf4\" (UID: \"9080ca44-1027-491b-9bf1-12443cd3b452\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"
Apr 17 15:19:17.478575 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.477974 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kx2v7"]
Apr 17 15:19:17.481369 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.481159 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"
Apr 17 15:19:17.482338 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:17.482291 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dc26639_dd19_46cc_97ba_c69e0b27c74e.slice/crio-e0d0362cbf4cd1cf9541c765cc90ce71da0854beb021b0d528fcaae52e5771ae WatchSource:0}: Error finding container e0d0362cbf4cd1cf9541c765cc90ce71da0854beb021b0d528fcaae52e5771ae: Status 404 returned error can't find the container with id e0d0362cbf4cd1cf9541c765cc90ce71da0854beb021b0d528fcaae52e5771ae
Apr 17 15:19:17.588726 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:17.588694 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4"]
Apr 17 15:19:17.592194 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:17.592169 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9080ca44_1027_491b_9bf1_12443cd3b452.slice/crio-fee5bcefed809b55a5d8628cf9580abcd4d8390e32fade2465d45f8a50903a8a WatchSource:0}: Error finding container fee5bcefed809b55a5d8628cf9580abcd4d8390e32fade2465d45f8a50903a8a: Status 404 returned error can't find the container with id fee5bcefed809b55a5d8628cf9580abcd4d8390e32fade2465d45f8a50903a8a
Apr 17 15:19:18.453619 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:18.453564 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4" event={"ID":"9080ca44-1027-491b-9bf1-12443cd3b452","Type":"ContainerStarted","Data":"fee5bcefed809b55a5d8628cf9580abcd4d8390e32fade2465d45f8a50903a8a"}
Apr 17 15:19:18.454639 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:18.454616 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kx2v7" event={"ID":"4dc26639-dd19-46cc-97ba-c69e0b27c74e","Type":"ContainerStarted","Data":"e0d0362cbf4cd1cf9541c765cc90ce71da0854beb021b0d528fcaae52e5771ae"}
Apr 17 15:19:19.457588 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:19.457549 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kx2v7" event={"ID":"4dc26639-dd19-46cc-97ba-c69e0b27c74e","Type":"ContainerStarted","Data":"0b196a5cbe79303fe4f4b80ff355430c3eb357848ff3eac242e66a91d4140f1e"}
Apr 17 15:19:19.471569 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:19.471532 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kx2v7" podStartSLOduration=1.013449767 podStartE2EDuration="2.471519882s" podCreationTimestamp="2026-04-17 15:19:17 +0000 UTC" firstStartedPulling="2026-04-17 15:19:17.485197397 +0000 UTC m=+130.876739410" lastFinishedPulling="2026-04-17 15:19:18.943267513 +0000 UTC m=+132.334809525" observedRunningTime="2026-04-17 15:19:19.47137262 +0000 UTC m=+132.862914652" watchObservedRunningTime="2026-04-17 15:19:19.471519882 +0000 UTC m=+132.863061916"
Apr 17 15:19:20.460634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:20.460594 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4" event={"ID":"9080ca44-1027-491b-9bf1-12443cd3b452","Type":"ContainerStarted","Data":"a5b3cedd1b1d3159be21bb30c8a6faa732c1cc4bfc1dbc6469d937da347b0636"}
Apr 17 15:19:20.475660 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:20.475617 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4" podStartSLOduration=1.067080523 podStartE2EDuration="3.475604836s" podCreationTimestamp="2026-04-17 15:19:17 +0000 UTC" firstStartedPulling="2026-04-17 15:19:17.594718353 +0000 UTC m=+130.986260369" lastFinishedPulling="2026-04-17 15:19:20.003242669 +0000 UTC m=+133.394784682" observedRunningTime="2026-04-17 15:19:20.474754489 +0000 UTC m=+133.866296525" watchObservedRunningTime="2026-04-17 15:19:20.475604836 +0000 UTC m=+133.867146871"
Apr 17 15:19:21.100185 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:21.100152 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-fbbct"]
Apr 17 15:19:21.103268 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:21.103252 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fbbct"
Apr 17 15:19:21.105935 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:21.105910 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 17 15:19:21.105935 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:21.105910 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 17 15:19:21.106120 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:21.106033 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-5qr8j\""
Apr 17 15:19:21.112930 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:21.112909 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-fbbct"]
Apr 17 15:19:21.214122 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:21.214090 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh5th\" (UniqueName: \"kubernetes.io/projected/7ba1a898-bdbb-4176-a7e2-3447d8e9254d-kube-api-access-fh5th\") pod \"migrator-74bb7799d9-fbbct\" (UID: \"7ba1a898-bdbb-4176-a7e2-3447d8e9254d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fbbct"
Apr 17 15:19:21.314930 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:21.314894 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh5th\" (UniqueName: \"kubernetes.io/projected/7ba1a898-bdbb-4176-a7e2-3447d8e9254d-kube-api-access-fh5th\") pod \"migrator-74bb7799d9-fbbct\" (UID: \"7ba1a898-bdbb-4176-a7e2-3447d8e9254d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fbbct"
Apr 17 15:19:21.322616 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:21.322588 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh5th\" (UniqueName: \"kubernetes.io/projected/7ba1a898-bdbb-4176-a7e2-3447d8e9254d-kube-api-access-fh5th\") pod \"migrator-74bb7799d9-fbbct\" (UID: \"7ba1a898-bdbb-4176-a7e2-3447d8e9254d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fbbct"
Apr 17 15:19:21.412671 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:21.412641 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fbbct"
Apr 17 15:19:21.522322 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:21.522259 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-fbbct"]
Apr 17 15:19:21.524692 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:21.524657 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ba1a898_bdbb_4176_a7e2_3447d8e9254d.slice/crio-e5d875a44b2eb849e48a4b6733e3a302464db260171226aaf9fde2cbfc12616b WatchSource:0}: Error finding container e5d875a44b2eb849e48a4b6733e3a302464db260171226aaf9fde2cbfc12616b: Status 404 returned error can't find the container with id e5d875a44b2eb849e48a4b6733e3a302464db260171226aaf9fde2cbfc12616b
Apr 17 15:19:22.466242 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:22.466210 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fbbct" event={"ID":"7ba1a898-bdbb-4176-a7e2-3447d8e9254d","Type":"ContainerStarted","Data":"e5d875a44b2eb849e48a4b6733e3a302464db260171226aaf9fde2cbfc12616b"}
Apr 17 15:19:22.828175 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:22.828096 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7"
Apr 17 15:19:22.828629 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:22.828250 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 15:19:22.828629 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:22.828352 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls podName:4261f15f-644e-4914-8e45-1bfa8a2447d7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:38.828330243 +0000 UTC m=+152.219872257 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n8pd7" (UID: "4261f15f-644e-4914-8e45-1bfa8a2447d7") : secret "cluster-monitoring-operator-tls" not found
Apr 17 15:19:22.929222 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:22.929192 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4"
Apr 17 15:19:22.929338 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:22.929291 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4"
Apr 17 15:19:22.929412 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:22.929348 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 15:19:22.929412 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:22.929404 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs podName:080f2dec-182b-40e8-adf6-95cf8c5342c7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:38.929389457 +0000 UTC m=+152.320931474 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs") pod "router-default-58ccc558bb-xngk4" (UID: "080f2dec-182b-40e8-adf6-95cf8c5342c7") : secret "router-metrics-certs-default" not found
Apr 17 15:19:22.929509 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:22.929424 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle podName:080f2dec-182b-40e8-adf6-95cf8c5342c7 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:38.929410166 +0000 UTC m=+152.320952183 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle") pod "router-default-58ccc558bb-xngk4" (UID: "080f2dec-182b-40e8-adf6-95cf8c5342c7") : configmap references non-existent config key: service-ca.crt
Apr 17 15:19:23.470282 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:23.470244 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fbbct" event={"ID":"7ba1a898-bdbb-4176-a7e2-3447d8e9254d","Type":"ContainerStarted","Data":"837dd3f3c83201d58cdd956bda956e84096cca0316ecfc56ed6050d7624b3a58"}
Apr 17 15:19:23.470282 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:23.470284 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fbbct" event={"ID":"7ba1a898-bdbb-4176-a7e2-3447d8e9254d","Type":"ContainerStarted","Data":"6a3bab392ff464db9316c78f2df48d0ced5a5c317d02782d66a28074f664f2b7"}
Apr 17 15:19:23.485362 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:23.485299 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fbbct" podStartSLOduration=1.173286538 podStartE2EDuration="2.485287104s" podCreationTimestamp="2026-04-17 15:19:21 +0000 UTC" firstStartedPulling="2026-04-17 15:19:21.526491091 +0000 UTC m=+134.918033104" lastFinishedPulling="2026-04-17 15:19:22.838491648 +0000 UTC m=+136.230033670" observedRunningTime="2026-04-17 15:19:23.4840673 +0000 UTC m=+136.875609363" watchObservedRunningTime="2026-04-17 15:19:23.485287104 +0000 UTC m=+136.876829164"
Apr 17 15:19:24.717105 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.717069 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-422l2"]
Apr 17 15:19:24.720040 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.720024 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-422l2"
Apr 17 15:19:24.722519 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.722497 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 15:19:24.722672 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.722507 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 15:19:24.723608 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.723591 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-pjppp\""
Apr 17 15:19:24.723704 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.723632 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 15:19:24.723704 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.723680 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 15:19:24.727130 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.727111 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-422l2"]
Apr 17 15:19:24.847104 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.847078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8239c686-f453-4cdb-9e3e-95fad8b76663-signing-key\") pod \"service-ca-865cb79987-422l2\" (UID: \"8239c686-f453-4cdb-9e3e-95fad8b76663\") " pod="openshift-service-ca/service-ca-865cb79987-422l2"
Apr 17 15:19:24.847104 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.847105 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8239c686-f453-4cdb-9e3e-95fad8b76663-signing-cabundle\") pod \"service-ca-865cb79987-422l2\" (UID: \"8239c686-f453-4cdb-9e3e-95fad8b76663\") " pod="openshift-service-ca/service-ca-865cb79987-422l2"
Apr 17 15:19:24.847256 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.847223 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqfj\" (UniqueName: \"kubernetes.io/projected/8239c686-f453-4cdb-9e3e-95fad8b76663-kube-api-access-fsqfj\") pod \"service-ca-865cb79987-422l2\" (UID: \"8239c686-f453-4cdb-9e3e-95fad8b76663\") " pod="openshift-service-ca/service-ca-865cb79987-422l2"
Apr 17 15:19:24.948363 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.948335 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqfj\" (UniqueName: \"kubernetes.io/projected/8239c686-f453-4cdb-9e3e-95fad8b76663-kube-api-access-fsqfj\") pod \"service-ca-865cb79987-422l2\" (UID: \"8239c686-f453-4cdb-9e3e-95fad8b76663\") " pod="openshift-service-ca/service-ca-865cb79987-422l2"
Apr 17 15:19:24.948479 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.948397 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8239c686-f453-4cdb-9e3e-95fad8b76663-signing-key\") pod \"service-ca-865cb79987-422l2\" (UID: \"8239c686-f453-4cdb-9e3e-95fad8b76663\") " pod="openshift-service-ca/service-ca-865cb79987-422l2"
Apr 17 15:19:24.948479 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.948416 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8239c686-f453-4cdb-9e3e-95fad8b76663-signing-cabundle\") pod \"service-ca-865cb79987-422l2\" (UID: \"8239c686-f453-4cdb-9e3e-95fad8b76663\") " pod="openshift-service-ca/service-ca-865cb79987-422l2"
Apr 17 15:19:24.948981 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.948964 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8239c686-f453-4cdb-9e3e-95fad8b76663-signing-cabundle\") pod \"service-ca-865cb79987-422l2\" (UID: \"8239c686-f453-4cdb-9e3e-95fad8b76663\") " pod="openshift-service-ca/service-ca-865cb79987-422l2"
Apr 17 15:19:24.950788 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.950765 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8239c686-f453-4cdb-9e3e-95fad8b76663-signing-key\") pod \"service-ca-865cb79987-422l2\" (UID: \"8239c686-f453-4cdb-9e3e-95fad8b76663\") " pod="openshift-service-ca/service-ca-865cb79987-422l2"
Apr 17 15:19:24.959935 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:24.955991 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqfj\" (UniqueName: \"kubernetes.io/projected/8239c686-f453-4cdb-9e3e-95fad8b76663-kube-api-access-fsqfj\") pod \"service-ca-865cb79987-422l2\" (UID: \"8239c686-f453-4cdb-9e3e-95fad8b76663\") " pod="openshift-service-ca/service-ca-865cb79987-422l2"
Apr 17 15:19:25.028793 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:25.028719 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-422l2" Apr 17 15:19:25.136764 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:25.136740 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-422l2"] Apr 17 15:19:25.138928 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:25.138902 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8239c686_f453_4cdb_9e3e_95fad8b76663.slice/crio-37793a01989827f11aa9dddb0e9a6c4de07c44e855a7807a35cf1c0cdd7f5c9f WatchSource:0}: Error finding container 37793a01989827f11aa9dddb0e9a6c4de07c44e855a7807a35cf1c0cdd7f5c9f: Status 404 returned error can't find the container with id 37793a01989827f11aa9dddb0e9a6c4de07c44e855a7807a35cf1c0cdd7f5c9f Apr 17 15:19:25.474879 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:25.474844 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-422l2" event={"ID":"8239c686-f453-4cdb-9e3e-95fad8b76663","Type":"ContainerStarted","Data":"37793a01989827f11aa9dddb0e9a6c4de07c44e855a7807a35cf1c0cdd7f5c9f"} Apr 17 15:19:27.481781 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:27.481748 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-422l2" event={"ID":"8239c686-f453-4cdb-9e3e-95fad8b76663","Type":"ContainerStarted","Data":"61a29d24ee3626baeb21322f7a56dab9cced8f26510a35dae03ad4e4927a95c5"} Apr 17 15:19:27.498436 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:27.498391 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-422l2" podStartSLOduration=1.58262816 podStartE2EDuration="3.498377955s" podCreationTimestamp="2026-04-17 15:19:24 +0000 UTC" firstStartedPulling="2026-04-17 15:19:25.140707251 +0000 UTC m=+138.532249264" lastFinishedPulling="2026-04-17 15:19:27.056457041 +0000 UTC 
m=+140.447999059" observedRunningTime="2026-04-17 15:19:27.49645101 +0000 UTC m=+140.887993046" watchObservedRunningTime="2026-04-17 15:19:27.498377955 +0000 UTC m=+140.889919989" Apr 17 15:19:38.874651 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:38.874608 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" Apr 17 15:19:38.877104 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:38.877069 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4261f15f-644e-4914-8e45-1bfa8a2447d7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n8pd7\" (UID: \"4261f15f-644e-4914-8e45-1bfa8a2447d7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" Apr 17 15:19:38.883798 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:38.883778 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" Apr 17 15:19:38.975427 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:38.975393 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:38.975558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:38.975535 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:38.975940 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:38.975921 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080f2dec-182b-40e8-adf6-95cf8c5342c7-service-ca-bundle\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:38.977988 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:38.977964 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/080f2dec-182b-40e8-adf6-95cf8c5342c7-metrics-certs\") pod \"router-default-58ccc558bb-xngk4\" (UID: \"080f2dec-182b-40e8-adf6-95cf8c5342c7\") " pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:38.994782 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:38.994758 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:38.997707 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:38.997683 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7"] Apr 17 15:19:39.005846 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:39.005823 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4261f15f_644e_4914_8e45_1bfa8a2447d7.slice/crio-60666317cfc1d28466142051a9485af6d73f5d0b72d84f1687da1650b743d19c WatchSource:0}: Error finding container 60666317cfc1d28466142051a9485af6d73f5d0b72d84f1687da1650b743d19c: Status 404 returned error can't find the container with id 60666317cfc1d28466142051a9485af6d73f5d0b72d84f1687da1650b743d19c Apr 17 15:19:39.112190 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:39.112171 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-58ccc558bb-xngk4"] Apr 17 15:19:39.113654 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:39.113629 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod080f2dec_182b_40e8_adf6_95cf8c5342c7.slice/crio-eea3d332b393dd666143645dcaa72e93a28a6a8533cc23c7c2f86e6752b19dff WatchSource:0}: Error finding container eea3d332b393dd666143645dcaa72e93a28a6a8533cc23c7c2f86e6752b19dff: Status 404 returned error can't find the container with id eea3d332b393dd666143645dcaa72e93a28a6a8533cc23c7c2f86e6752b19dff Apr 17 15:19:39.511732 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:39.511637 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-58ccc558bb-xngk4" event={"ID":"080f2dec-182b-40e8-adf6-95cf8c5342c7","Type":"ContainerStarted","Data":"ac17cd9f6aaad8a215f6994217d5e4ed01d33035525be509ee158864c0669509"} Apr 17 15:19:39.511732 ip-10-0-130-92 kubenswrapper[2567]: 
I0417 15:19:39.511676 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-58ccc558bb-xngk4" event={"ID":"080f2dec-182b-40e8-adf6-95cf8c5342c7","Type":"ContainerStarted","Data":"eea3d332b393dd666143645dcaa72e93a28a6a8533cc23c7c2f86e6752b19dff"} Apr 17 15:19:39.512658 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:39.512632 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" event={"ID":"4261f15f-644e-4914-8e45-1bfa8a2447d7","Type":"ContainerStarted","Data":"60666317cfc1d28466142051a9485af6d73f5d0b72d84f1687da1650b743d19c"} Apr 17 15:19:39.528709 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:39.528656 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-58ccc558bb-xngk4" podStartSLOduration=32.528642478 podStartE2EDuration="32.528642478s" podCreationTimestamp="2026-04-17 15:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:19:39.527729883 +0000 UTC m=+152.919271917" watchObservedRunningTime="2026-04-17 15:19:39.528642478 +0000 UTC m=+152.920184553" Apr 17 15:19:39.995679 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:39.995643 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:39.997927 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:39.997907 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:40.515330 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:40.515288 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:40.516428 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:40.516408 2567 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-58ccc558bb-xngk4" Apr 17 15:19:42.521867 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:42.521822 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" event={"ID":"4261f15f-644e-4914-8e45-1bfa8a2447d7","Type":"ContainerStarted","Data":"f56ee04f36e79028256195b4545a7fb455a788a0cf4b802394c16c18c3b508e9"} Apr 17 15:19:42.537527 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:42.537478 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n8pd7" podStartSLOduration=32.919611977 podStartE2EDuration="35.537463405s" podCreationTimestamp="2026-04-17 15:19:07 +0000 UTC" firstStartedPulling="2026-04-17 15:19:39.008053675 +0000 UTC m=+152.399595689" lastFinishedPulling="2026-04-17 15:19:41.625905091 +0000 UTC m=+155.017447117" observedRunningTime="2026-04-17 15:19:42.536378631 +0000 UTC m=+155.927920671" watchObservedRunningTime="2026-04-17 15:19:42.537463405 +0000 UTC m=+155.929005439" Apr 17 15:19:46.084482 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:46.084444 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" podUID="8571a62a-9563-470f-aa7c-a31197ec34fd" Apr 17 15:19:46.108538 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:46.108513 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" podUID="cf60d958-a515-4c65-8fd2-9bd9d19fa3ab" Apr 17 15:19:46.144190 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:46.144163 2567 pod_workers.go:1301] "Error 
syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jpn8m" podUID="eeb34dd9-f023-4a23-8830-151d5b605625" Apr 17 15:19:46.163405 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:46.163376 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-78c86" podUID="b2ad6199-9c63-412c-b433-b95e9dec556b" Apr 17 15:19:46.167497 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:46.167479 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-n8fjz" podUID="48990b07-a036-41ef-a6cd-89d7520c417c" Apr 17 15:19:46.534016 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.533994 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:19:46.534149 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.534028 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:19:46.616707 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.616680 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gs7lp"] Apr 17 15:19:46.622386 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.622363 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.625812 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.625787 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 15:19:46.625812 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.625805 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-ctp72\"" Apr 17 15:19:46.625974 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.625834 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 15:19:46.630238 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.630218 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gs7lp"] Apr 17 15:19:46.634493 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.634467 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/737f8ec5-4bba-47e1-95e8-4699679c5200-crio-socket\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.634603 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.634578 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/737f8ec5-4bba-47e1-95e8-4699679c5200-data-volume\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.634680 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.634668 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/737f8ec5-4bba-47e1-95e8-4699679c5200-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.635235 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.634809 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhk6l\" (UniqueName: \"kubernetes.io/projected/737f8ec5-4bba-47e1-95e8-4699679c5200-kube-api-access-dhk6l\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.635235 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.634930 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/737f8ec5-4bba-47e1-95e8-4699679c5200-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.735591 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.735554 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhk6l\" (UniqueName: \"kubernetes.io/projected/737f8ec5-4bba-47e1-95e8-4699679c5200-kube-api-access-dhk6l\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.735591 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.735599 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/737f8ec5-4bba-47e1-95e8-4699679c5200-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gs7lp\" (UID: 
\"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.735844 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.735820 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/737f8ec5-4bba-47e1-95e8-4699679c5200-crio-socket\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.735925 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.735909 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/737f8ec5-4bba-47e1-95e8-4699679c5200-data-volume\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.735984 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.735964 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/737f8ec5-4bba-47e1-95e8-4699679c5200-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.735984 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.735909 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/737f8ec5-4bba-47e1-95e8-4699679c5200-crio-socket\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.736178 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.736160 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/737f8ec5-4bba-47e1-95e8-4699679c5200-data-volume\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.736238 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.736161 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/737f8ec5-4bba-47e1-95e8-4699679c5200-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.738137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.738117 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/737f8ec5-4bba-47e1-95e8-4699679c5200-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.748035 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.748014 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhk6l\" (UniqueName: \"kubernetes.io/projected/737f8ec5-4bba-47e1-95e8-4699679c5200-kube-api-access-dhk6l\") pod \"insights-runtime-extractor-gs7lp\" (UID: \"737f8ec5-4bba-47e1-95e8-4699679c5200\") " pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:46.933169 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:46.933136 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gs7lp" Apr 17 15:19:47.073760 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:47.073559 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gs7lp"] Apr 17 15:19:47.076549 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:47.076518 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod737f8ec5_4bba_47e1_95e8_4699679c5200.slice/crio-31b27473f557b312eb805fc16557b4898ed69499e37c4ec9e8e1fa40abf71e9e WatchSource:0}: Error finding container 31b27473f557b312eb805fc16557b4898ed69499e37c4ec9e8e1fa40abf71e9e: Status 404 returned error can't find the container with id 31b27473f557b312eb805fc16557b4898ed69499e37c4ec9e8e1fa40abf71e9e Apr 17 15:19:47.538044 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:47.538009 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gs7lp" event={"ID":"737f8ec5-4bba-47e1-95e8-4699679c5200","Type":"ContainerStarted","Data":"4870451b2accf0e2102282596c3f5241b13c8ff11813c4d8d4de85fee1759afd"} Apr 17 15:19:47.538044 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:47.538051 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gs7lp" event={"ID":"737f8ec5-4bba-47e1-95e8-4699679c5200","Type":"ContainerStarted","Data":"31b27473f557b312eb805fc16557b4898ed69499e37c4ec9e8e1fa40abf71e9e"} Apr 17 15:19:48.542579 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:48.542544 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gs7lp" event={"ID":"737f8ec5-4bba-47e1-95e8-4699679c5200","Type":"ContainerStarted","Data":"52ea2d4a3071392b99941e08d7abcd91261cf707545b87ab9a599f8558df7826"} Apr 17 15:19:49.546954 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:49.546867 2567 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-insights/insights-runtime-extractor-gs7lp" event={"ID":"737f8ec5-4bba-47e1-95e8-4699679c5200","Type":"ContainerStarted","Data":"e65a07808f67500a0b8c6a9b4c95618cbb956653d257bc4c01b0ca4ca2b08a4e"} Apr 17 15:19:49.567186 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:49.567134 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gs7lp" podStartSLOduration=1.558222734 podStartE2EDuration="3.567121407s" podCreationTimestamp="2026-04-17 15:19:46 +0000 UTC" firstStartedPulling="2026-04-17 15:19:47.132006057 +0000 UTC m=+160.523548070" lastFinishedPulling="2026-04-17 15:19:49.140904716 +0000 UTC m=+162.532446743" observedRunningTime="2026-04-17 15:19:49.566501523 +0000 UTC m=+162.958043559" watchObservedRunningTime="2026-04-17 15:19:49.567121407 +0000 UTC m=+162.958663441" Apr 17 15:19:50.973873 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:50.973841 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:19:50.976236 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:50.976213 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") pod \"image-registry-7c649d5796-5gdb8\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:19:51.037589 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.037564 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5jck6\"" Apr 17 15:19:51.045549 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:19:51.045532 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:19:51.074598 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.074564 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:19:51.074770 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.074614 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:19:51.074770 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.074645 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:19:51.076908 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.076885 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb34dd9-f023-4a23-8830-151d5b605625-cert\") pod \"ingress-canary-jpn8m\" (UID: \"eeb34dd9-f023-4a23-8830-151d5b605625\") " pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:19:51.077036 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.076936 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/cf60d958-a515-4c65-8fd2-9bd9d19fa3ab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmx8c\" (UID: \"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:19:51.077099 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.077039 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2ad6199-9c63-412c-b433-b95e9dec556b-metrics-tls\") pod \"dns-default-78c86\" (UID: \"b2ad6199-9c63-412c-b433-b95e9dec556b\") " pod="openshift-dns/dns-default-78c86" Apr 17 15:19:51.164417 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.164393 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c649d5796-5gdb8"] Apr 17 15:19:51.166752 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:51.166725 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8571a62a_9563_470f_aa7c_a31197ec34fd.slice/crio-a521a6d621d80c5cd495f682ae6f3cbc26394acb64e742649a4c3a06a9f4a961 WatchSource:0}: Error finding container a521a6d621d80c5cd495f682ae6f3cbc26394acb64e742649a4c3a06a9f4a961: Status 404 returned error can't find the container with id a521a6d621d80c5cd495f682ae6f3cbc26394acb64e742649a4c3a06a9f4a961 Apr 17 15:19:51.337915 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.337843 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wwl79\"" Apr 17 15:19:51.345754 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.345738 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jpn8m" Apr 17 15:19:51.465950 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.465862 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jpn8m"] Apr 17 15:19:51.468199 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:51.468174 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeb34dd9_f023_4a23_8830_151d5b605625.slice/crio-3a16a25b5af6a35167df4989c4f03c95df4276099d50be4fe82d18ee32ae6e36 WatchSource:0}: Error finding container 3a16a25b5af6a35167df4989c4f03c95df4276099d50be4fe82d18ee32ae6e36: Status 404 returned error can't find the container with id 3a16a25b5af6a35167df4989c4f03c95df4276099d50be4fe82d18ee32ae6e36 Apr 17 15:19:51.553554 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.553522 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jpn8m" event={"ID":"eeb34dd9-f023-4a23-8830-151d5b605625","Type":"ContainerStarted","Data":"3a16a25b5af6a35167df4989c4f03c95df4276099d50be4fe82d18ee32ae6e36"} Apr 17 15:19:51.554777 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.554751 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" event={"ID":"8571a62a-9563-470f-aa7c-a31197ec34fd","Type":"ContainerStarted","Data":"4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd"} Apr 17 15:19:51.554866 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.554781 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" event={"ID":"8571a62a-9563-470f-aa7c-a31197ec34fd","Type":"ContainerStarted","Data":"a521a6d621d80c5cd495f682ae6f3cbc26394acb64e742649a4c3a06a9f4a961"} Apr 17 15:19:51.554928 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.554884 2567 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:19:51.574057 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:51.574019 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" podStartSLOduration=164.57400854 podStartE2EDuration="2m44.57400854s" podCreationTimestamp="2026-04-17 15:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:19:51.572408038 +0000 UTC m=+164.963950085" watchObservedRunningTime="2026-04-17 15:19:51.57400854 +0000 UTC m=+164.965550575" Apr 17 15:19:53.561079 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:53.561036 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jpn8m" event={"ID":"eeb34dd9-f023-4a23-8830-151d5b605625","Type":"ContainerStarted","Data":"867e16d8e2dd66b31fdbc98d0cb9ecb6b60bf659b892cf1d36f276eecb2d1580"} Apr 17 15:19:53.576885 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:53.576839 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jpn8m" podStartSLOduration=128.609785583 podStartE2EDuration="2m10.57682715s" podCreationTimestamp="2026-04-17 15:17:43 +0000 UTC" firstStartedPulling="2026-04-17 15:19:51.470055769 +0000 UTC m=+164.861597791" lastFinishedPulling="2026-04-17 15:19:53.437097345 +0000 UTC m=+166.828639358" observedRunningTime="2026-04-17 15:19:53.575911173 +0000 UTC m=+166.967453221" watchObservedRunningTime="2026-04-17 15:19:53.57682715 +0000 UTC m=+166.968369184" Apr 17 15:19:54.522329 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.521869 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gct64"] Apr 17 15:19:54.527658 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.527632 2567 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:54.530173 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.530133 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 15:19:54.530354 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.530265 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 15:19:54.530354 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.530340 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-z5rpz\"" Apr 17 15:19:54.531232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.531212 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 15:19:54.535706 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.535292 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gct64"] Apr 17 15:19:54.539383 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.539361 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7vvjb"] Apr 17 15:19:54.542520 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.542504 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.545953 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.545932 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 15:19:54.545953 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.545944 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 15:19:54.546253 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.546238 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 15:19:54.546337 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.546251 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-95j6v\"" Apr 17 15:19:54.604322 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.604283 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff092072-7608-4ec7-8227-561dd01cd73c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:54.604696 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.604358 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/522b9c9d-a847-4b8a-971b-6f6ac840eae0-metrics-client-ca\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.604696 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.604390 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-wtmp\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.604696 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.604418 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-accelerators-collector-config\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.604696 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.604507 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff092072-7608-4ec7-8227-561dd01cd73c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:54.604696 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.604552 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96xsv\" (UniqueName: \"kubernetes.io/projected/ff092072-7608-4ec7-8227-561dd01cd73c-kube-api-access-96xsv\") pod \"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:54.604696 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.604588 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.604696 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.604631 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/522b9c9d-a847-4b8a-971b-6f6ac840eae0-root\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.604961 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.604722 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff092072-7608-4ec7-8227-561dd01cd73c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:54.604961 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.604766 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-tls\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.604961 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.604817 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/522b9c9d-a847-4b8a-971b-6f6ac840eae0-sys\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.604961 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:19:54.604841 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztr48\" (UniqueName: \"kubernetes.io/projected/522b9c9d-a847-4b8a-971b-6f6ac840eae0-kube-api-access-ztr48\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.604961 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.604869 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-textfile\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.705442 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705409 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/522b9c9d-a847-4b8a-971b-6f6ac840eae0-root\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.705602 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705462 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff092072-7608-4ec7-8227-561dd01cd73c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:54.705602 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705498 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-tls\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") 
" pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.705602 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705533 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/522b9c9d-a847-4b8a-971b-6f6ac840eae0-sys\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.705602 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705533 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/522b9c9d-a847-4b8a-971b-6f6ac840eae0-root\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.705602 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705556 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztr48\" (UniqueName: \"kubernetes.io/projected/522b9c9d-a847-4b8a-971b-6f6ac840eae0-kube-api-access-ztr48\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.705863 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705617 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-textfile\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.705863 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705679 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff092072-7608-4ec7-8227-561dd01cd73c-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:54.705863 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705713 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/522b9c9d-a847-4b8a-971b-6f6ac840eae0-metrics-client-ca\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.705863 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705741 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-wtmp\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.705863 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705767 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-accelerators-collector-config\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.705863 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705819 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff092072-7608-4ec7-8227-561dd01cd73c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:54.705863 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705848 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-96xsv\" (UniqueName: \"kubernetes.io/projected/ff092072-7608-4ec7-8227-561dd01cd73c-kube-api-access-96xsv\") pod \"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:54.705863 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705853 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/522b9c9d-a847-4b8a-971b-6f6ac840eae0-sys\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.706244 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705878 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.706244 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.705975 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-wtmp\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.706244 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:54.706066 2567 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 15:19:54.706244 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:54.706155 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ff092072-7608-4ec7-8227-561dd01cd73c-openshift-state-metrics-tls podName:ff092072-7608-4ec7-8227-561dd01cd73c nodeName:}" failed. No retries permitted until 2026-04-17 15:19:55.206134897 +0000 UTC m=+168.597676933 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/ff092072-7608-4ec7-8227-561dd01cd73c-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-gct64" (UID: "ff092072-7608-4ec7-8227-561dd01cd73c") : secret "openshift-state-metrics-tls" not found Apr 17 15:19:54.706528 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.706245 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-textfile\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.706804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.706784 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/522b9c9d-a847-4b8a-971b-6f6ac840eae0-metrics-client-ca\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.707054 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.707029 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-accelerators-collector-config\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.707160 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.707136 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff092072-7608-4ec7-8227-561dd01cd73c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:54.709023 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.708994 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-tls\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.709130 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.709088 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/522b9c9d-a847-4b8a-971b-6f6ac840eae0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.709214 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.708997 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff092072-7608-4ec7-8227-561dd01cd73c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:54.714990 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.714963 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztr48\" (UniqueName: \"kubernetes.io/projected/522b9c9d-a847-4b8a-971b-6f6ac840eae0-kube-api-access-ztr48\") pod \"node-exporter-7vvjb\" (UID: \"522b9c9d-a847-4b8a-971b-6f6ac840eae0\") " 
pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.715700 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.715679 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96xsv\" (UniqueName: \"kubernetes.io/projected/ff092072-7608-4ec7-8227-561dd01cd73c-kube-api-access-96xsv\") pod \"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:54.852056 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:54.851972 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7vvjb" Apr 17 15:19:54.859899 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:54.859868 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod522b9c9d_a847_4b8a_971b_6f6ac840eae0.slice/crio-260145dbe50e82854e4d226b6075ec73d2255745312fa92c73ac758dc338ff4f WatchSource:0}: Error finding container 260145dbe50e82854e4d226b6075ec73d2255745312fa92c73ac758dc338ff4f: Status 404 returned error can't find the container with id 260145dbe50e82854e4d226b6075ec73d2255745312fa92c73ac758dc338ff4f Apr 17 15:19:55.209776 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.209739 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff092072-7608-4ec7-8227-561dd01cd73c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:55.212095 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.212075 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/ff092072-7608-4ec7-8227-561dd01cd73c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gct64\" (UID: \"ff092072-7608-4ec7-8227-561dd01cd73c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:55.438439 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.438408 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" Apr 17 15:19:55.567475 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.567437 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7vvjb" event={"ID":"522b9c9d-a847-4b8a-971b-6f6ac840eae0","Type":"ContainerStarted","Data":"260145dbe50e82854e4d226b6075ec73d2255745312fa92c73ac758dc338ff4f"} Apr 17 15:19:55.580779 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.580753 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gct64"] Apr 17 15:19:55.611396 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.611366 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 15:19:55.615377 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.615355 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.617671 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.617646 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 15:19:55.617807 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.617670 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 15:19:55.617807 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.617796 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 15:19:55.617965 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.617915 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 15:19:55.617965 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.617951 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 15:19:55.618047 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.618017 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-r4jt7\"" Apr 17 15:19:55.618154 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.618133 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 15:19:55.618218 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.618172 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 15:19:55.618366 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.618233 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 15:19:55.618442 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.618374 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 15:19:55.628506 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.628485 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 15:19:55.713124 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713090 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-config-volume\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.713298 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713136 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6f3d475-a740-4879-84a4-0bbd9dda11e5-config-out\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.713298 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713166 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.713298 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713198 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.713298 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713268 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6f3d475-a740-4879-84a4-0bbd9dda11e5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.713527 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713363 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.713527 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713398 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.713527 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713426 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dq68\" (UniqueName: \"kubernetes.io/projected/e6f3d475-a740-4879-84a4-0bbd9dda11e5-kube-api-access-8dq68\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.713527 
ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713469 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.713527 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713521 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.713758 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713548 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.713758 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713596 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-web-config\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.713758 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.713624 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.741306 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:55.741275 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff092072_7608_4ec7_8227_561dd01cd73c.slice/crio-13c730a8e3c8d0ae36559296e02acb7aa3806c9336a81de21a1cac4a7509b8c4 WatchSource:0}: Error finding container 13c730a8e3c8d0ae36559296e02acb7aa3806c9336a81de21a1cac4a7509b8c4: Status 404 returned error can't find the container with id 13c730a8e3c8d0ae36559296e02acb7aa3806c9336a81de21a1cac4a7509b8c4 Apr 17 15:19:55.814826 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.814797 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-config-volume\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.814933 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.814839 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6f3d475-a740-4879-84a4-0bbd9dda11e5-config-out\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.814992 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.814952 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.815043 
ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.815009 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.815103 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.815051 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6f3d475-a740-4879-84a4-0bbd9dda11e5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.815158 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.815095 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.815158 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.815136 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.815260 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.815158 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dq68\" (UniqueName: \"kubernetes.io/projected/e6f3d475-a740-4879-84a4-0bbd9dda11e5-kube-api-access-8dq68\") pod \"alertmanager-main-0\" (UID: 
\"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.815260 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.815200 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.815384 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.815259 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.815384 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.815286 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.815384 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.815349 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-web-config\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.815384 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.815378 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.816442 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:19:55.815879 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-trusted-ca-bundle podName:e6f3d475-a740-4879-84a4-0bbd9dda11e5 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:56.315855875 +0000 UTC m=+169.707397895 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5") : configmap references non-existent config key: ca-bundle.crt Apr 17 15:19:55.816442 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.815894 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.816442 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.816173 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.817610 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.817384 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/e6f3d475-a740-4879-84a4-0bbd9dda11e5-config-out\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.819764 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.819739 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.820547 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.820524 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6f3d475-a740-4879-84a4-0bbd9dda11e5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.820749 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.820725 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.821846 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.821353 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.821959 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.821908 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.822246 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.822222 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-web-config\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.822608 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.822476 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-config-volume\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.823629 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.823609 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:55.827793 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:55.827766 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dq68\" (UniqueName: \"kubernetes.io/projected/e6f3d475-a740-4879-84a4-0bbd9dda11e5-kube-api-access-8dq68\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:56.319624 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:56.319592 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:56.320583 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:56.320564 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:56.527368 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:56.527325 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:19:56.571544 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:56.571454 2567 generic.go:358] "Generic (PLEG): container finished" podID="522b9c9d-a847-4b8a-971b-6f6ac840eae0" containerID="2aceedbc4002ec61e5095705d2aeccf44689cef2e9c465a4c9143e24a5310ee3" exitCode=0 Apr 17 15:19:56.571544 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:56.571501 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7vvjb" event={"ID":"522b9c9d-a847-4b8a-971b-6f6ac840eae0","Type":"ContainerDied","Data":"2aceedbc4002ec61e5095705d2aeccf44689cef2e9c465a4c9143e24a5310ee3"} Apr 17 15:19:56.575382 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:56.575209 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" event={"ID":"ff092072-7608-4ec7-8227-561dd01cd73c","Type":"ContainerStarted","Data":"1e09f2504ce4b9d1dcaf0d0ec4b5a5afdb91802a43d95959fb7dd6d2a6822d19"} Apr 17 15:19:56.575382 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:19:56.575254 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" event={"ID":"ff092072-7608-4ec7-8227-561dd01cd73c","Type":"ContainerStarted","Data":"1d448b12b55545891e6b0e6cb3c248b9d363038d9c70604d73cc08c71bfa3451"} Apr 17 15:19:56.575680 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:56.575269 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" event={"ID":"ff092072-7608-4ec7-8227-561dd01cd73c","Type":"ContainerStarted","Data":"13c730a8e3c8d0ae36559296e02acb7aa3806c9336a81de21a1cac4a7509b8c4"} Apr 17 15:19:56.657088 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:56.657062 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 15:19:56.660130 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:56.660101 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6f3d475_a740_4879_84a4_0bbd9dda11e5.slice/crio-24c1b22bbbbed99ed1d49c154e0c950a0bc8db19c2336d254a0fdbd453128a9d WatchSource:0}: Error finding container 24c1b22bbbbed99ed1d49c154e0c950a0bc8db19c2336d254a0fdbd453128a9d: Status 404 returned error can't find the container with id 24c1b22bbbbed99ed1d49c154e0c950a0bc8db19c2336d254a0fdbd453128a9d Apr 17 15:19:57.113114 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:57.113081 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:19:57.580148 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:57.580106 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerStarted","Data":"24c1b22bbbbed99ed1d49c154e0c950a0bc8db19c2336d254a0fdbd453128a9d"} Apr 17 15:19:57.582285 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:57.582241 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7vvjb" event={"ID":"522b9c9d-a847-4b8a-971b-6f6ac840eae0","Type":"ContainerStarted","Data":"054f91132d44cd5c87dbd6a021fa682e638d924ca8f16e35491864089a49fc42"} Apr 17 15:19:57.582285 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:57.582279 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7vvjb" event={"ID":"522b9c9d-a847-4b8a-971b-6f6ac840eae0","Type":"ContainerStarted","Data":"6a86714731a272f6d7342769a3616f164e01920e2876d5e28064a9051c584f4e"} Apr 17 15:19:57.584422 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:57.584395 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" event={"ID":"ff092072-7608-4ec7-8227-561dd01cd73c","Type":"ContainerStarted","Data":"d60507c502432681b955a913dae219fa11a83c48dff3987b36fcc1bd33838010"} Apr 17 15:19:57.599545 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:57.599495 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7vvjb" podStartSLOduration=2.675000967 podStartE2EDuration="3.599478975s" podCreationTimestamp="2026-04-17 15:19:54 +0000 UTC" firstStartedPulling="2026-04-17 15:19:54.861577665 +0000 UTC m=+168.253119677" lastFinishedPulling="2026-04-17 15:19:55.786055663 +0000 UTC m=+169.177597685" observedRunningTime="2026-04-17 15:19:57.598579609 +0000 UTC m=+170.990121643" 
watchObservedRunningTime="2026-04-17 15:19:57.599478975 +0000 UTC m=+170.991021011" Apr 17 15:19:57.614458 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:57.614407 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gct64" podStartSLOduration=2.325111256 podStartE2EDuration="3.614390275s" podCreationTimestamp="2026-04-17 15:19:54 +0000 UTC" firstStartedPulling="2026-04-17 15:19:55.879522522 +0000 UTC m=+169.271064540" lastFinishedPulling="2026-04-17 15:19:57.168801534 +0000 UTC m=+170.560343559" observedRunningTime="2026-04-17 15:19:57.614163391 +0000 UTC m=+171.005705428" watchObservedRunningTime="2026-04-17 15:19:57.614390275 +0000 UTC m=+171.005932314" Apr 17 15:19:58.109094 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:58.109005 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:19:58.111627 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:58.111606 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hcrsv\"" Apr 17 15:19:58.119543 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:58.119527 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" Apr 17 15:19:58.230098 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:58.230068 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c"] Apr 17 15:19:58.232894 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:58.232872 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf60d958_a515_4c65_8fd2_9bd9d19fa3ab.slice/crio-7461993e8576e587fdb4dfe073fd7b938803d8494469db726a6d365c22698a4c WatchSource:0}: Error finding container 7461993e8576e587fdb4dfe073fd7b938803d8494469db726a6d365c22698a4c: Status 404 returned error can't find the container with id 7461993e8576e587fdb4dfe073fd7b938803d8494469db726a6d365c22698a4c Apr 17 15:19:58.588345 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:58.588290 2567 generic.go:358] "Generic (PLEG): container finished" podID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerID="c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1" exitCode=0 Apr 17 15:19:58.588529 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:58.588360 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerDied","Data":"c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1"} Apr 17 15:19:58.589546 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:58.589521 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" event={"ID":"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab","Type":"ContainerStarted","Data":"7461993e8576e587fdb4dfe073fd7b938803d8494469db726a6d365c22698a4c"} Apr 17 15:19:59.109173 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:59.109140 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-78c86" Apr 17 15:19:59.111875 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:59.111854 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p8skx\"" Apr 17 15:19:59.120000 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:59.119983 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-78c86" Apr 17 15:19:59.263604 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:59.262165 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-78c86"] Apr 17 15:19:59.266739 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:19:59.266710 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2ad6199_9c63_412c_b433_b95e9dec556b.slice/crio-2d5808d4a4bb7fc43498886c3d17d411e1135474097feac46a3c9b92d26e5c84 WatchSource:0}: Error finding container 2d5808d4a4bb7fc43498886c3d17d411e1135474097feac46a3c9b92d26e5c84: Status 404 returned error can't find the container with id 2d5808d4a4bb7fc43498886c3d17d411e1135474097feac46a3c9b92d26e5c84 Apr 17 15:19:59.594011 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:19:59.593969 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-78c86" event={"ID":"b2ad6199-9c63-412c-b433-b95e9dec556b","Type":"ContainerStarted","Data":"2d5808d4a4bb7fc43498886c3d17d411e1135474097feac46a3c9b92d26e5c84"} Apr 17 15:20:00.599406 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:00.599306 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerStarted","Data":"2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87"} Apr 17 15:20:00.599406 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:00.599368 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerStarted","Data":"333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22"} Apr 17 15:20:00.600866 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:00.600826 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" event={"ID":"cf60d958-a515-4c65-8fd2-9bd9d19fa3ab","Type":"ContainerStarted","Data":"8dd9e860c6d7505a606d567320643e6c819a0183291cda6eceb546168618ca68"} Apr 17 15:20:00.615353 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:00.615295 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmx8c" podStartSLOduration=170.661411406 podStartE2EDuration="2m52.615281915s" podCreationTimestamp="2026-04-17 15:17:08 +0000 UTC" firstStartedPulling="2026-04-17 15:19:58.234766278 +0000 UTC m=+171.626308291" lastFinishedPulling="2026-04-17 15:20:00.188636784 +0000 UTC m=+173.580178800" observedRunningTime="2026-04-17 15:20:00.614484414 +0000 UTC m=+174.006026449" watchObservedRunningTime="2026-04-17 15:20:00.615281915 +0000 UTC m=+174.006823950" Apr 17 15:20:01.606443 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:01.606405 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerStarted","Data":"c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da"} Apr 17 15:20:01.606443 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:01.606443 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerStarted","Data":"b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361"} Apr 17 15:20:01.606443 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:01.606452 2567 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerStarted","Data":"ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7"} Apr 17 15:20:01.607762 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:01.607738 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-78c86" event={"ID":"b2ad6199-9c63-412c-b433-b95e9dec556b","Type":"ContainerStarted","Data":"0a8c6580f1ee6e4c660648154604f02cccae796a0b7fb5067f2b333fc8f4c36b"} Apr 17 15:20:01.607848 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:01.607769 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-78c86" event={"ID":"b2ad6199-9c63-412c-b433-b95e9dec556b","Type":"ContainerStarted","Data":"b6412a4d4671e1fdb0c903ab5e8ee7b24a455396db9b7bc6f27993be86c714d3"} Apr 17 15:20:01.628713 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:01.628673 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-78c86" podStartSLOduration=137.12707917 podStartE2EDuration="2m18.628660108s" podCreationTimestamp="2026-04-17 15:17:43 +0000 UTC" firstStartedPulling="2026-04-17 15:19:59.269006221 +0000 UTC m=+172.660548236" lastFinishedPulling="2026-04-17 15:20:00.77058716 +0000 UTC m=+174.162129174" observedRunningTime="2026-04-17 15:20:01.627028732 +0000 UTC m=+175.018570804" watchObservedRunningTime="2026-04-17 15:20:01.628660108 +0000 UTC m=+175.020202181" Apr 17 15:20:02.613150 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:02.613113 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerStarted","Data":"e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb"} Apr 17 15:20:02.613619 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:02.613446 2567 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-dns/dns-default-78c86" Apr 17 15:20:02.644576 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:02.644531 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.297295894 podStartE2EDuration="7.644517754s" podCreationTimestamp="2026-04-17 15:19:55 +0000 UTC" firstStartedPulling="2026-04-17 15:19:56.661777447 +0000 UTC m=+170.053319461" lastFinishedPulling="2026-04-17 15:20:02.008999292 +0000 UTC m=+175.400541321" observedRunningTime="2026-04-17 15:20:02.642575017 +0000 UTC m=+176.034117052" watchObservedRunningTime="2026-04-17 15:20:02.644517754 +0000 UTC m=+176.036059788" Apr 17 15:20:11.050362 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:11.050306 2567 patch_prober.go:28] interesting pod/image-registry-7c649d5796-5gdb8 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 15:20:11.050713 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:11.050384 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" podUID="8571a62a-9563-470f-aa7c-a31197ec34fd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 15:20:12.562115 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:12.562087 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:20:12.618053 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:12.618026 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-78c86" Apr 17 15:20:18.720675 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:18.720641 2567 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7c649d5796-5gdb8"] Apr 17 15:20:25.268940 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.268909 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d56b4868b-5xnp7"] Apr 17 15:20:25.270956 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.270932 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.273378 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.273359 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 15:20:25.273467 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.273422 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 15:20:25.273467 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.273439 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 15:20:25.274663 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.274637 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 15:20:25.274771 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.274648 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 15:20:25.274771 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.274650 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 15:20:25.274771 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.274649 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 15:20:25.274771 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:20:25.274650 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-w8shx\"" Apr 17 15:20:25.280094 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.280076 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d56b4868b-5xnp7"] Apr 17 15:20:25.468248 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.468209 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-oauth-serving-cert\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.468586 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.468560 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-oauth-config\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.468700 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.468599 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-config\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.468700 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.468650 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-serving-cert\") pod \"console-7d56b4868b-5xnp7\" (UID: 
\"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.468700 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.468678 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-service-ca\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.468825 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.468705 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5qth\" (UniqueName: \"kubernetes.io/projected/19d88c69-b00c-4c2f-9518-80fcc8bee111-kube-api-access-r5qth\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.569605 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.569524 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-serving-cert\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.569605 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.569564 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-service-ca\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.569605 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.569602 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5qth\" (UniqueName: 
\"kubernetes.io/projected/19d88c69-b00c-4c2f-9518-80fcc8bee111-kube-api-access-r5qth\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.569866 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.569665 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-oauth-serving-cert\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.569866 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.569728 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-oauth-config\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.569866 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.569749 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-config\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.570414 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.570385 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-service-ca\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.570532 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.570389 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-oauth-serving-cert\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.570532 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.570452 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-config\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.572039 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.572016 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-oauth-config\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.572203 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.572183 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-serving-cert\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.576993 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.576971 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5qth\" (UniqueName: \"kubernetes.io/projected/19d88c69-b00c-4c2f-9518-80fcc8bee111-kube-api-access-r5qth\") pod \"console-7d56b4868b-5xnp7\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.580632 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.580612 2567 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:25.679114 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.679078 2567 generic.go:358] "Generic (PLEG): container finished" podID="9080ca44-1027-491b-9bf1-12443cd3b452" containerID="a5b3cedd1b1d3159be21bb30c8a6faa732c1cc4bfc1dbc6469d937da347b0636" exitCode=0 Apr 17 15:20:25.679258 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.679156 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4" event={"ID":"9080ca44-1027-491b-9bf1-12443cd3b452","Type":"ContainerDied","Data":"a5b3cedd1b1d3159be21bb30c8a6faa732c1cc4bfc1dbc6469d937da347b0636"} Apr 17 15:20:25.679657 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.679640 2567 scope.go:117] "RemoveContainer" containerID="a5b3cedd1b1d3159be21bb30c8a6faa732c1cc4bfc1dbc6469d937da347b0636" Apr 17 15:20:25.701255 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:25.701226 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d56b4868b-5xnp7"] Apr 17 15:20:25.703940 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:20:25.703917 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19d88c69_b00c_4c2f_9518_80fcc8bee111.slice/crio-02c2d77777c3ae57700d98cadfe1620c1db7c4871d2b41c2fe37fa600df54402 WatchSource:0}: Error finding container 02c2d77777c3ae57700d98cadfe1620c1db7c4871d2b41c2fe37fa600df54402: Status 404 returned error can't find the container with id 02c2d77777c3ae57700d98cadfe1620c1db7c4871d2b41c2fe37fa600df54402 Apr 17 15:20:26.684085 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:26.684046 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d56b4868b-5xnp7" 
event={"ID":"19d88c69-b00c-4c2f-9518-80fcc8bee111","Type":"ContainerStarted","Data":"02c2d77777c3ae57700d98cadfe1620c1db7c4871d2b41c2fe37fa600df54402"} Apr 17 15:20:26.685918 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:26.685893 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hchf4" event={"ID":"9080ca44-1027-491b-9bf1-12443cd3b452","Type":"ContainerStarted","Data":"8254a08c10d998b9d4c11810a104949909ae91b74c13a2df35956640633f6e28"} Apr 17 15:20:29.696385 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:29.696349 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d56b4868b-5xnp7" event={"ID":"19d88c69-b00c-4c2f-9518-80fcc8bee111","Type":"ContainerStarted","Data":"e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0"} Apr 17 15:20:29.713636 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:29.713595 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d56b4868b-5xnp7" podStartSLOduration=1.4921658340000001 podStartE2EDuration="4.713581044s" podCreationTimestamp="2026-04-17 15:20:25 +0000 UTC" firstStartedPulling="2026-04-17 15:20:25.705641616 +0000 UTC m=+199.097183637" lastFinishedPulling="2026-04-17 15:20:28.927056828 +0000 UTC m=+202.318598847" observedRunningTime="2026-04-17 15:20:29.711986624 +0000 UTC m=+203.103528659" watchObservedRunningTime="2026-04-17 15:20:29.713581044 +0000 UTC m=+203.105123078" Apr 17 15:20:32.705680 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:32.705647 2567 generic.go:358] "Generic (PLEG): container finished" podID="4270e3fe-b069-4a89-bd6d-10514be6fb65" containerID="e6c4d07455cb865fab8bd60310eae9d48aee9d122bed51bd843b5a84a2a7074a" exitCode=0 Apr 17 15:20:32.706064 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:32.705714 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-wvmmd" event={"ID":"4270e3fe-b069-4a89-bd6d-10514be6fb65","Type":"ContainerDied","Data":"e6c4d07455cb865fab8bd60310eae9d48aee9d122bed51bd843b5a84a2a7074a"} Apr 17 15:20:32.706105 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:32.706061 2567 scope.go:117] "RemoveContainer" containerID="e6c4d07455cb865fab8bd60310eae9d48aee9d122bed51bd843b5a84a2a7074a" Apr 17 15:20:33.448073 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:33.448047 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e6f3d475-a740-4879-84a4-0bbd9dda11e5/init-config-reloader/0.log" Apr 17 15:20:33.453441 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:33.453415 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e6f3d475-a740-4879-84a4-0bbd9dda11e5/alertmanager/0.log" Apr 17 15:20:33.604050 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:33.604023 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e6f3d475-a740-4879-84a4-0bbd9dda11e5/config-reloader/0.log" Apr 17 15:20:33.710885 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:33.710811 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wvmmd" event={"ID":"4270e3fe-b069-4a89-bd6d-10514be6fb65","Type":"ContainerStarted","Data":"ae3954b2f89bd3ba0451eaea0500b0953a941167f910f0a8c68bc7701ebfcd5c"} Apr 17 15:20:33.804433 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:33.804407 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e6f3d475-a740-4879-84a4-0bbd9dda11e5/kube-rbac-proxy-web/0.log" Apr 17 15:20:34.004109 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:34.004019 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e6f3d475-a740-4879-84a4-0bbd9dda11e5/kube-rbac-proxy/0.log" Apr 17 
15:20:34.204462 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:34.204431 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e6f3d475-a740-4879-84a4-0bbd9dda11e5/kube-rbac-proxy-metric/0.log" Apr 17 15:20:34.403824 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:34.403797 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e6f3d475-a740-4879-84a4-0bbd9dda11e5/prom-label-proxy/0.log" Apr 17 15:20:34.606565 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:34.606530 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-n8pd7_4261f15f-644e-4914-8e45-1bfa8a2447d7/cluster-monitoring-operator/0.log" Apr 17 15:20:35.581405 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:35.581374 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:35.581405 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:35.581412 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:35.586079 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:35.586056 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:35.719969 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:35.719941 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:20:35.804106 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:35.804076 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7vvjb_522b9c9d-a847-4b8a-971b-6f6ac840eae0/init-textfile/0.log" Apr 17 15:20:36.004889 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:36.004857 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-7vvjb_522b9c9d-a847-4b8a-971b-6f6ac840eae0/node-exporter/0.log" Apr 17 15:20:36.203704 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:36.203679 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7vvjb_522b9c9d-a847-4b8a-971b-6f6ac840eae0/kube-rbac-proxy/0.log" Apr 17 15:20:37.604517 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:37.604490 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gct64_ff092072-7608-4ec7-8227-561dd01cd73c/kube-rbac-proxy-main/0.log" Apr 17 15:20:37.806630 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:37.806605 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gct64_ff092072-7608-4ec7-8227-561dd01cd73c/kube-rbac-proxy-self/0.log" Apr 17 15:20:38.004576 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:38.004544 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gct64_ff092072-7608-4ec7-8227-561dd01cd73c/openshift-state-metrics/0.log" Apr 17 15:20:42.004043 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:42.004004 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-lmx8c_cf60d958-a515-4c65-8fd2-9bd9d19fa3ab/networking-console-plugin/0.log" Apr 17 15:20:42.604334 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:42.604280 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d56b4868b-5xnp7_19d88c69-b00c-4c2f-9518-80fcc8bee111/console/0.log" Apr 17 15:20:43.740192 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:43.740147 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" podUID="8571a62a-9563-470f-aa7c-a31197ec34fd" containerName="registry" 
containerID="cri-o://4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd" gracePeriod=30 Apr 17 15:20:43.981349 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:43.977117 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:20:44.035790 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.035705 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8571a62a-9563-470f-aa7c-a31197ec34fd-image-registry-private-configuration\") pod \"8571a62a-9563-470f-aa7c-a31197ec34fd\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " Apr 17 15:20:44.035790 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.035773 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-certificates\") pod \"8571a62a-9563-470f-aa7c-a31197ec34fd\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " Apr 17 15:20:44.036004 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.035804 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") pod \"8571a62a-9563-470f-aa7c-a31197ec34fd\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " Apr 17 15:20:44.036004 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.035844 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn9zs\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-kube-api-access-qn9zs\") pod \"8571a62a-9563-470f-aa7c-a31197ec34fd\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " Apr 17 15:20:44.036116 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.036047 2567 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8571a62a-9563-470f-aa7c-a31197ec34fd-trusted-ca\") pod \"8571a62a-9563-470f-aa7c-a31197ec34fd\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " Apr 17 15:20:44.036212 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.036150 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8571a62a-9563-470f-aa7c-a31197ec34fd-ca-trust-extracted\") pod \"8571a62a-9563-470f-aa7c-a31197ec34fd\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " Apr 17 15:20:44.036212 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.036187 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8571a62a-9563-470f-aa7c-a31197ec34fd-installation-pull-secrets\") pod \"8571a62a-9563-470f-aa7c-a31197ec34fd\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " Apr 17 15:20:44.036396 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.036229 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-bound-sa-token\") pod \"8571a62a-9563-470f-aa7c-a31197ec34fd\" (UID: \"8571a62a-9563-470f-aa7c-a31197ec34fd\") " Apr 17 15:20:44.036453 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.036388 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8571a62a-9563-470f-aa7c-a31197ec34fd" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:20:44.036580 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.036562 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-certificates\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:20:44.036659 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.036563 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8571a62a-9563-470f-aa7c-a31197ec34fd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8571a62a-9563-470f-aa7c-a31197ec34fd" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:20:44.038705 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.038666 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8571a62a-9563-470f-aa7c-a31197ec34fd-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "8571a62a-9563-470f-aa7c-a31197ec34fd" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:20:44.038836 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.038726 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-kube-api-access-qn9zs" (OuterVolumeSpecName: "kube-api-access-qn9zs") pod "8571a62a-9563-470f-aa7c-a31197ec34fd" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd"). InnerVolumeSpecName "kube-api-access-qn9zs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:20:44.038900 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.038849 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8571a62a-9563-470f-aa7c-a31197ec34fd" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:20:44.038944 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.038913 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8571a62a-9563-470f-aa7c-a31197ec34fd" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:20:44.039857 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.039832 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8571a62a-9563-470f-aa7c-a31197ec34fd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8571a62a-9563-470f-aa7c-a31197ec34fd" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:20:44.046735 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.046704 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8571a62a-9563-470f-aa7c-a31197ec34fd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8571a62a-9563-470f-aa7c-a31197ec34fd" (UID: "8571a62a-9563-470f-aa7c-a31197ec34fd"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:20:44.137778 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.137747 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8571a62a-9563-470f-aa7c-a31197ec34fd-trusted-ca\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:20:44.137778 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.137775 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8571a62a-9563-470f-aa7c-a31197ec34fd-ca-trust-extracted\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:20:44.137778 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.137786 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8571a62a-9563-470f-aa7c-a31197ec34fd-installation-pull-secrets\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:20:44.137989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.137796 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-bound-sa-token\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:20:44.137989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.137805 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8571a62a-9563-470f-aa7c-a31197ec34fd-image-registry-private-configuration\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:20:44.137989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.137814 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-registry-tls\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:20:44.137989 
ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.137823 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qn9zs\" (UniqueName: \"kubernetes.io/projected/8571a62a-9563-470f-aa7c-a31197ec34fd-kube-api-access-qn9zs\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:20:44.743423 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.743390 2567 generic.go:358] "Generic (PLEG): container finished" podID="8571a62a-9563-470f-aa7c-a31197ec34fd" containerID="4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd" exitCode=0 Apr 17 15:20:44.743852 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.743466 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" event={"ID":"8571a62a-9563-470f-aa7c-a31197ec34fd","Type":"ContainerDied","Data":"4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd"} Apr 17 15:20:44.743852 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.743502 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" event={"ID":"8571a62a-9563-470f-aa7c-a31197ec34fd","Type":"ContainerDied","Data":"a521a6d621d80c5cd495f682ae6f3cbc26394acb64e742649a4c3a06a9f4a961"} Apr 17 15:20:44.743852 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.743516 2567 scope.go:117] "RemoveContainer" containerID="4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd" Apr 17 15:20:44.743852 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.743473 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7c649d5796-5gdb8" Apr 17 15:20:44.752743 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.752726 2567 scope.go:117] "RemoveContainer" containerID="4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd" Apr 17 15:20:44.752992 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:20:44.752971 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd\": container with ID starting with 4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd not found: ID does not exist" containerID="4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd" Apr 17 15:20:44.753032 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.753000 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd"} err="failed to get container status \"4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd\": rpc error: code = NotFound desc = could not find container \"4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd\": container with ID starting with 4a9555ff3cfebd9d16748e236d3d5d3f4f3b2cd69bee993efc5286dd4b7f80dd not found: ID does not exist" Apr 17 15:20:44.764437 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.764417 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7c649d5796-5gdb8"] Apr 17 15:20:44.767521 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:44.767487 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7c649d5796-5gdb8"] Apr 17 15:20:45.113864 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:45.113774 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8571a62a-9563-470f-aa7c-a31197ec34fd" 
path="/var/lib/kubelet/pods/8571a62a-9563-470f-aa7c-a31197ec34fd/volumes" Apr 17 15:20:46.789801 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:20:46.789729 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d56b4868b-5xnp7"] Apr 17 15:21:11.809391 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:11.809344 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d56b4868b-5xnp7" podUID="19d88c69-b00c-4c2f-9518-80fcc8bee111" containerName="console" containerID="cri-o://e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0" gracePeriod=15 Apr 17 15:21:12.049034 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.049012 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d56b4868b-5xnp7_19d88c69-b00c-4c2f-9518-80fcc8bee111/console/0.log" Apr 17 15:21:12.049155 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.049071 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:21:12.171341 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.171293 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-config\") pod \"19d88c69-b00c-4c2f-9518-80fcc8bee111\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " Apr 17 15:21:12.171520 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.171385 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-service-ca\") pod \"19d88c69-b00c-4c2f-9518-80fcc8bee111\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " Apr 17 15:21:12.171520 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.171420 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-oauth-config\") pod \"19d88c69-b00c-4c2f-9518-80fcc8bee111\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " Apr 17 15:21:12.171520 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.171436 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-serving-cert\") pod \"19d88c69-b00c-4c2f-9518-80fcc8bee111\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " Apr 17 15:21:12.171520 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.171462 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-oauth-serving-cert\") pod \"19d88c69-b00c-4c2f-9518-80fcc8bee111\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " Apr 17 15:21:12.171520 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:21:12.171501 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5qth\" (UniqueName: \"kubernetes.io/projected/19d88c69-b00c-4c2f-9518-80fcc8bee111-kube-api-access-r5qth\") pod \"19d88c69-b00c-4c2f-9518-80fcc8bee111\" (UID: \"19d88c69-b00c-4c2f-9518-80fcc8bee111\") " Apr 17 15:21:12.171776 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.171748 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-config" (OuterVolumeSpecName: "console-config") pod "19d88c69-b00c-4c2f-9518-80fcc8bee111" (UID: "19d88c69-b00c-4c2f-9518-80fcc8bee111"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:12.171883 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.171861 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-service-ca" (OuterVolumeSpecName: "service-ca") pod "19d88c69-b00c-4c2f-9518-80fcc8bee111" (UID: "19d88c69-b00c-4c2f-9518-80fcc8bee111"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:12.171935 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.171877 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "19d88c69-b00c-4c2f-9518-80fcc8bee111" (UID: "19d88c69-b00c-4c2f-9518-80fcc8bee111"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:12.173718 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.173694 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d88c69-b00c-4c2f-9518-80fcc8bee111-kube-api-access-r5qth" (OuterVolumeSpecName: "kube-api-access-r5qth") pod "19d88c69-b00c-4c2f-9518-80fcc8bee111" (UID: "19d88c69-b00c-4c2f-9518-80fcc8bee111"). InnerVolumeSpecName "kube-api-access-r5qth". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:21:12.173814 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.173725 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "19d88c69-b00c-4c2f-9518-80fcc8bee111" (UID: "19d88c69-b00c-4c2f-9518-80fcc8bee111"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:12.173814 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.173744 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "19d88c69-b00c-4c2f-9518-80fcc8bee111" (UID: "19d88c69-b00c-4c2f-9518-80fcc8bee111"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:12.272744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.272684 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r5qth\" (UniqueName: \"kubernetes.io/projected/19d88c69-b00c-4c2f-9518-80fcc8bee111-kube-api-access-r5qth\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:12.272744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.272741 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-config\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:12.272744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.272751 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-service-ca\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:12.272927 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.272760 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-oauth-config\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:12.272927 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.272769 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19d88c69-b00c-4c2f-9518-80fcc8bee111-console-serving-cert\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:12.272927 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.272778 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19d88c69-b00c-4c2f-9518-80fcc8bee111-oauth-serving-cert\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:12.827399 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:21:12.827371 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d56b4868b-5xnp7_19d88c69-b00c-4c2f-9518-80fcc8bee111/console/0.log" Apr 17 15:21:12.827870 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.827410 2567 generic.go:358] "Generic (PLEG): container finished" podID="19d88c69-b00c-4c2f-9518-80fcc8bee111" containerID="e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0" exitCode=2 Apr 17 15:21:12.827870 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.827459 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d56b4868b-5xnp7" event={"ID":"19d88c69-b00c-4c2f-9518-80fcc8bee111","Type":"ContainerDied","Data":"e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0"} Apr 17 15:21:12.827870 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.827480 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d56b4868b-5xnp7" event={"ID":"19d88c69-b00c-4c2f-9518-80fcc8bee111","Type":"ContainerDied","Data":"02c2d77777c3ae57700d98cadfe1620c1db7c4871d2b41c2fe37fa600df54402"} Apr 17 15:21:12.827870 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.827493 2567 scope.go:117] "RemoveContainer" containerID="e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0" Apr 17 15:21:12.827870 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.827497 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d56b4868b-5xnp7" Apr 17 15:21:12.840786 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.840767 2567 scope.go:117] "RemoveContainer" containerID="e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0" Apr 17 15:21:12.841051 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:21:12.841032 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0\": container with ID starting with e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0 not found: ID does not exist" containerID="e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0" Apr 17 15:21:12.841099 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.841059 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0"} err="failed to get container status \"e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0\": rpc error: code = NotFound desc = could not find container \"e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0\": container with ID starting with e7d65cc75d36346f575e5157f0382dd443b9a151e5cf3710ff3da72d73008ed0 not found: ID does not exist" Apr 17 15:21:12.850626 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.850604 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d56b4868b-5xnp7"] Apr 17 15:21:12.856230 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:12.856207 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d56b4868b-5xnp7"] Apr 17 15:21:13.112992 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.112914 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d88c69-b00c-4c2f-9518-80fcc8bee111" 
path="/var/lib/kubelet/pods/19d88c69-b00c-4c2f-9518-80fcc8bee111/volumes" Apr 17 15:21:13.966027 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.965994 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f8d6cbd49-hvvm6"] Apr 17 15:21:13.966437 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.966373 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8571a62a-9563-470f-aa7c-a31197ec34fd" containerName="registry" Apr 17 15:21:13.966437 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.966387 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8571a62a-9563-470f-aa7c-a31197ec34fd" containerName="registry" Apr 17 15:21:13.966437 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.966413 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19d88c69-b00c-4c2f-9518-80fcc8bee111" containerName="console" Apr 17 15:21:13.966437 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.966419 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d88c69-b00c-4c2f-9518-80fcc8bee111" containerName="console" Apr 17 15:21:13.966599 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.966554 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="8571a62a-9563-470f-aa7c-a31197ec34fd" containerName="registry" Apr 17 15:21:13.966599 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.966567 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="19d88c69-b00c-4c2f-9518-80fcc8bee111" containerName="console" Apr 17 15:21:13.971347 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.971321 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:13.973874 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.973843 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-w8shx\"" Apr 17 15:21:13.973874 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.973847 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 15:21:13.974108 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.973848 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 15:21:13.974724 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.974700 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 15:21:13.974849 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.974814 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 15:21:13.974849 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.974841 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 15:21:13.974958 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.974922 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 15:21:13.975201 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.975185 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 15:21:13.980674 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.980650 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f8d6cbd49-hvvm6"] Apr 17 15:21:13.980796 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:13.980734 2567 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 15:21:14.086606 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.086574 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-oauth-serving-cert\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.086606 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.086613 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7pk9\" (UniqueName: \"kubernetes.io/projected/c9e80a9a-382c-4794-a077-7b9b4a747f03-kube-api-access-h7pk9\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.086815 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.086662 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-trusted-ca-bundle\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.086815 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.086687 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-service-ca\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.086815 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.086706 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-serving-cert\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.086815 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.086768 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-oauth-config\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.086952 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.086819 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-config\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.187168 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.187131 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-trusted-ca-bundle\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.187168 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.187168 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-service-ca\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.187441 ip-10-0-130-92 kubenswrapper[2567]: 
I0417 15:21:14.187191 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-serving-cert\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.187441 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.187212 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-oauth-config\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.187441 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.187240 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-config\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.187441 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.187351 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-oauth-serving-cert\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.187441 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.187387 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7pk9\" (UniqueName: \"kubernetes.io/projected/c9e80a9a-382c-4794-a077-7b9b4a747f03-kube-api-access-h7pk9\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " 
pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.187962 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.187939 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-config\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.188071 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.188050 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-service-ca\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.188137 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.188064 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-oauth-serving-cert\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.188192 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.188152 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-trusted-ca-bundle\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.189660 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.189636 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-oauth-config\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " 
pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.189793 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.189776 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-serving-cert\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.194672 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.194655 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7pk9\" (UniqueName: \"kubernetes.io/projected/c9e80a9a-382c-4794-a077-7b9b4a747f03-kube-api-access-h7pk9\") pod \"console-f8d6cbd49-hvvm6\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.282765 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.282694 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:14.398005 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.397982 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f8d6cbd49-hvvm6"] Apr 17 15:21:14.400434 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:21:14.400409 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e80a9a_382c_4794_a077_7b9b4a747f03.slice/crio-5521f81d54240439076a0c0546bb9f27ae7d940300605ae38d050d08c153e266 WatchSource:0}: Error finding container 5521f81d54240439076a0c0546bb9f27ae7d940300605ae38d050d08c153e266: Status 404 returned error can't find the container with id 5521f81d54240439076a0c0546bb9f27ae7d940300605ae38d050d08c153e266 Apr 17 15:21:14.761958 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.761925 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 
17 15:21:14.762389 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.762344 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="alertmanager" containerID="cri-o://333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22" gracePeriod=120 Apr 17 15:21:14.762548 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.762404 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="config-reloader" containerID="cri-o://2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87" gracePeriod=120 Apr 17 15:21:14.762548 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.762390 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="kube-rbac-proxy-web" containerID="cri-o://ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7" gracePeriod=120 Apr 17 15:21:14.762548 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.762433 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="kube-rbac-proxy" containerID="cri-o://b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361" gracePeriod=120 Apr 17 15:21:14.762548 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.762393 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="kube-rbac-proxy-metric" containerID="cri-o://c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da" gracePeriod=120 Apr 17 15:21:14.762548 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:21:14.762398 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="prom-label-proxy" containerID="cri-o://e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb" gracePeriod=120 Apr 17 15:21:14.834517 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.834485 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f8d6cbd49-hvvm6" event={"ID":"c9e80a9a-382c-4794-a077-7b9b4a747f03","Type":"ContainerStarted","Data":"349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867"} Apr 17 15:21:14.834517 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.834521 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f8d6cbd49-hvvm6" event={"ID":"c9e80a9a-382c-4794-a077-7b9b4a747f03","Type":"ContainerStarted","Data":"5521f81d54240439076a0c0546bb9f27ae7d940300605ae38d050d08c153e266"} Apr 17 15:21:14.850023 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:14.849980 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f8d6cbd49-hvvm6" podStartSLOduration=1.849968161 podStartE2EDuration="1.849968161s" podCreationTimestamp="2026-04-17 15:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:21:14.84929724 +0000 UTC m=+248.240839287" watchObservedRunningTime="2026-04-17 15:21:14.849968161 +0000 UTC m=+248.241510197" Apr 17 15:21:15.840649 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:15.840610 2567 generic.go:358] "Generic (PLEG): container finished" podID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerID="e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb" exitCode=0 Apr 17 15:21:15.840649 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:15.840634 2567 generic.go:358] "Generic (PLEG): container finished" 
podID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerID="b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361" exitCode=0 Apr 17 15:21:15.840649 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:15.840643 2567 generic.go:358] "Generic (PLEG): container finished" podID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerID="2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87" exitCode=0 Apr 17 15:21:15.840649 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:15.840650 2567 generic.go:358] "Generic (PLEG): container finished" podID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerID="333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22" exitCode=0 Apr 17 15:21:15.841236 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:15.840680 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerDied","Data":"e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb"} Apr 17 15:21:15.841236 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:15.840710 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerDied","Data":"b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361"} Apr 17 15:21:15.841236 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:15.840720 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerDied","Data":"2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87"} Apr 17 15:21:15.841236 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:15.840729 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerDied","Data":"333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22"} Apr 17 15:21:15.999103 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:15.999079 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:16.002248 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002232 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-cluster-tls-config\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002293 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002268 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy-web\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002293 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002285 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-main-tls\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002388 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002305 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-config-volume\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002388 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002351 
2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dq68\" (UniqueName: \"kubernetes.io/projected/e6f3d475-a740-4879-84a4-0bbd9dda11e5-kube-api-access-8dq68\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002388 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002375 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-metrics-client-ca\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002517 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002419 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-trusted-ca-bundle\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002517 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002458 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-main-db\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002517 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002488 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6f3d475-a740-4879-84a4-0bbd9dda11e5-config-out\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002665 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002520 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6f3d475-a740-4879-84a4-0bbd9dda11e5-tls-assets\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002665 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002545 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002665 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002588 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-web-config\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002665 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002612 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy\") pod \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\" (UID: \"e6f3d475-a740-4879-84a4-0bbd9dda11e5\") " Apr 17 15:21:16.002866 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.002821 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:16.004612 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.003222 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:16.004822 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.004781 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:21:16.005772 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.005738 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-config-volume" (OuterVolumeSpecName: "config-volume") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:16.005878 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.005856 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:16.006075 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.006055 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f3d475-a740-4879-84a4-0bbd9dda11e5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:21:16.006348 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.006302 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:16.007510 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.007476 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:16.008641 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.008576 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f3d475-a740-4879-84a4-0bbd9dda11e5-config-out" (OuterVolumeSpecName: "config-out") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:21:16.008858 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.008816 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f3d475-a740-4879-84a4-0bbd9dda11e5-kube-api-access-8dq68" (OuterVolumeSpecName: "kube-api-access-8dq68") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "kube-api-access-8dq68". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:21:16.009037 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.009014 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:16.013186 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.013164 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:16.018981 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.018956 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-web-config" (OuterVolumeSpecName: "web-config") pod "e6f3d475-a740-4879-84a4-0bbd9dda11e5" (UID: "e6f3d475-a740-4879-84a4-0bbd9dda11e5"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:16.104185 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104104 2567 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:16.104185 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104130 2567 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e6f3d475-a740-4879-84a4-0bbd9dda11e5-alertmanager-main-db\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:16.104185 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104141 2567 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6f3d475-a740-4879-84a4-0bbd9dda11e5-config-out\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:16.104185 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104150 2567 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6f3d475-a740-4879-84a4-0bbd9dda11e5-tls-assets\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:16.104185 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104164 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:16.104185 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104177 2567 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-web-config\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 
15:21:16.104185 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104186 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:16.104553 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104196 2567 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-cluster-tls-config\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:16.104553 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104205 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:16.104553 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104214 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-secret-alertmanager-main-tls\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:16.104553 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104223 2567 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e6f3d475-a740-4879-84a4-0bbd9dda11e5-config-volume\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:16.104553 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104231 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8dq68\" (UniqueName: \"kubernetes.io/projected/e6f3d475-a740-4879-84a4-0bbd9dda11e5-kube-api-access-8dq68\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 
15:21:16.104553 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.104243 2567 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6f3d475-a740-4879-84a4-0bbd9dda11e5-metrics-client-ca\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:21:16.846009 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.845977 2567 generic.go:358] "Generic (PLEG): container finished" podID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerID="c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da" exitCode=0 Apr 17 15:21:16.846009 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.846002 2567 generic.go:358] "Generic (PLEG): container finished" podID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerID="ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7" exitCode=0 Apr 17 15:21:16.846453 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.846050 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerDied","Data":"c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da"} Apr 17 15:21:16.846453 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.846087 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerDied","Data":"ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7"} Apr 17 15:21:16.846453 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.846097 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:16.846453 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.846112 2567 scope.go:117] "RemoveContainer" containerID="e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb" Apr 17 15:21:16.846453 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.846102 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e6f3d475-a740-4879-84a4-0bbd9dda11e5","Type":"ContainerDied","Data":"24c1b22bbbbed99ed1d49c154e0c950a0bc8db19c2336d254a0fdbd453128a9d"} Apr 17 15:21:16.856217 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.856192 2567 scope.go:117] "RemoveContainer" containerID="c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da" Apr 17 15:21:16.862956 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.862939 2567 scope.go:117] "RemoveContainer" containerID="b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361" Apr 17 15:21:16.869104 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.869090 2567 scope.go:117] "RemoveContainer" containerID="ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7" Apr 17 15:21:16.871133 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.871110 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 15:21:16.876038 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.876020 2567 scope.go:117] "RemoveContainer" containerID="2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87" Apr 17 15:21:16.877005 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.876986 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 15:21:16.882297 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.882140 2567 scope.go:117] "RemoveContainer" containerID="333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22" Apr 17 15:21:16.888199 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:21:16.888181 2567 scope.go:117] "RemoveContainer" containerID="c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1" Apr 17 15:21:16.893998 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.893983 2567 scope.go:117] "RemoveContainer" containerID="e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb" Apr 17 15:21:16.894258 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:21:16.894231 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb\": container with ID starting with e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb not found: ID does not exist" containerID="e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb" Apr 17 15:21:16.894303 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.894266 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb"} err="failed to get container status \"e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb\": rpc error: code = NotFound desc = could not find container \"e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb\": container with ID starting with e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb not found: ID does not exist" Apr 17 15:21:16.894303 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.894285 2567 scope.go:117] "RemoveContainer" containerID="c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da" Apr 17 15:21:16.894536 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:21:16.894517 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da\": container with ID starting with 
c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da not found: ID does not exist" containerID="c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da" Apr 17 15:21:16.894581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.894545 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da"} err="failed to get container status \"c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da\": rpc error: code = NotFound desc = could not find container \"c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da\": container with ID starting with c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da not found: ID does not exist" Apr 17 15:21:16.894581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.894564 2567 scope.go:117] "RemoveContainer" containerID="b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361" Apr 17 15:21:16.894783 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:21:16.894766 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361\": container with ID starting with b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361 not found: ID does not exist" containerID="b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361" Apr 17 15:21:16.894847 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.894791 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361"} err="failed to get container status \"b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361\": rpc error: code = NotFound desc = could not find container \"b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361\": container with ID starting with 
b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361 not found: ID does not exist" Apr 17 15:21:16.894847 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.894814 2567 scope.go:117] "RemoveContainer" containerID="ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7" Apr 17 15:21:16.895044 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:21:16.895028 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7\": container with ID starting with ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7 not found: ID does not exist" containerID="ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7" Apr 17 15:21:16.895081 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.895048 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7"} err="failed to get container status \"ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7\": rpc error: code = NotFound desc = could not find container \"ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7\": container with ID starting with ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7 not found: ID does not exist" Apr 17 15:21:16.895081 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.895061 2567 scope.go:117] "RemoveContainer" containerID="2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87" Apr 17 15:21:16.895241 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:21:16.895228 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87\": container with ID starting with 2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87 not found: ID does not exist" 
containerID="2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87" Apr 17 15:21:16.895286 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.895244 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87"} err="failed to get container status \"2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87\": rpc error: code = NotFound desc = could not find container \"2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87\": container with ID starting with 2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87 not found: ID does not exist" Apr 17 15:21:16.895286 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.895255 2567 scope.go:117] "RemoveContainer" containerID="333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22" Apr 17 15:21:16.895438 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:21:16.895420 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22\": container with ID starting with 333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22 not found: ID does not exist" containerID="333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22" Apr 17 15:21:16.895476 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.895444 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22"} err="failed to get container status \"333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22\": rpc error: code = NotFound desc = could not find container \"333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22\": container with ID starting with 333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22 not found: ID does not exist" Apr 17 
15:21:16.895476 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.895456 2567 scope.go:117] "RemoveContainer" containerID="c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1" Apr 17 15:21:16.895635 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:21:16.895622 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1\": container with ID starting with c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1 not found: ID does not exist" containerID="c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1" Apr 17 15:21:16.895671 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.895637 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1"} err="failed to get container status \"c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1\": rpc error: code = NotFound desc = could not find container \"c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1\": container with ID starting with c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1 not found: ID does not exist" Apr 17 15:21:16.895671 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.895648 2567 scope.go:117] "RemoveContainer" containerID="e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb" Apr 17 15:21:16.895861 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.895839 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb"} err="failed to get container status \"e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb\": rpc error: code = NotFound desc = could not find container \"e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb\": container with ID starting with 
e239109e81b050eb06a3292008278417ad0ae4d8839b4a7e5ad2100077421bfb not found: ID does not exist" Apr 17 15:21:16.895902 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.895864 2567 scope.go:117] "RemoveContainer" containerID="c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da" Apr 17 15:21:16.896084 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.896066 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da"} err="failed to get container status \"c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da\": rpc error: code = NotFound desc = could not find container \"c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da\": container with ID starting with c95d8669838dd7873cad91fc21fa9efaa48bab9c8afb673d86359e9a73bec3da not found: ID does not exist" Apr 17 15:21:16.896124 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.896084 2567 scope.go:117] "RemoveContainer" containerID="b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361" Apr 17 15:21:16.896305 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.896288 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361"} err="failed to get container status \"b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361\": rpc error: code = NotFound desc = could not find container \"b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361\": container with ID starting with b7795ffd6fc8b631d3ede6644673e1c48684b5685f83902e1c7f37761b6a4361 not found: ID does not exist" Apr 17 15:21:16.896305 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.896305 2567 scope.go:117] "RemoveContainer" containerID="ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7" Apr 17 15:21:16.896516 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.896500 2567 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7"} err="failed to get container status \"ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7\": rpc error: code = NotFound desc = could not find container \"ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7\": container with ID starting with ca89d367cbd710ab61367af41ed32ef499a84555fb1f0538880a97d68f454fc7 not found: ID does not exist" Apr 17 15:21:16.896560 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.896516 2567 scope.go:117] "RemoveContainer" containerID="2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87" Apr 17 15:21:16.896712 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.896696 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87"} err="failed to get container status \"2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87\": rpc error: code = NotFound desc = could not find container \"2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87\": container with ID starting with 2ce3985ed3b31022dd529860225cf5f7f7452659ecffa66266680261f9716e87 not found: ID does not exist" Apr 17 15:21:16.896750 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.896713 2567 scope.go:117] "RemoveContainer" containerID="333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22" Apr 17 15:21:16.896907 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.896892 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22"} err="failed to get container status \"333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22\": rpc error: code = NotFound desc = could not find container 
\"333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22\": container with ID starting with 333b453c0b71de01c538553f24e5f0b37eac4bc2525f26c3b7f55e1f723c1b22 not found: ID does not exist" Apr 17 15:21:16.896907 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.896906 2567 scope.go:117] "RemoveContainer" containerID="c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1" Apr 17 15:21:16.897086 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.897071 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1"} err="failed to get container status \"c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1\": rpc error: code = NotFound desc = could not find container \"c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1\": container with ID starting with c790820a923b0d5ad883d1c1d0ed6212a871963eff88a84fe358e36b00d9a1d1 not found: ID does not exist" Apr 17 15:21:16.901336 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901298 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 15:21:16.901615 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901602 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="init-config-reloader" Apr 17 15:21:16.901659 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901618 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="init-config-reloader" Apr 17 15:21:16.901659 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901627 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="alertmanager" Apr 17 15:21:16.901659 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901635 2567 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="alertmanager" Apr 17 15:21:16.901659 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901652 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="prom-label-proxy" Apr 17 15:21:16.901659 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901658 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="prom-label-proxy" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901664 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="config-reloader" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901669 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="config-reloader" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901675 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="kube-rbac-proxy-web" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901679 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="kube-rbac-proxy-web" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901686 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="kube-rbac-proxy-metric" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901692 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="kube-rbac-proxy-metric" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901698 2567 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="kube-rbac-proxy" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901703 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="kube-rbac-proxy" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901758 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="kube-rbac-proxy-metric" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901766 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="prom-label-proxy" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901773 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="alertmanager" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901779 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="kube-rbac-proxy" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901787 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="kube-rbac-proxy-web" Apr 17 15:21:16.901804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.901797 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" containerName="config-reloader" Apr 17 15:21:16.906805 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.906789 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:16.910124 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.910097 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 15:21:16.910216 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.910140 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 15:21:16.910216 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.910179 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 15:21:16.910669 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.910653 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 15:21:16.911078 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.911065 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 15:21:16.911443 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.911428 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 15:21:16.911541 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.911452 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-r4jt7\"" Apr 17 15:21:16.915497 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.915479 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 15:21:16.915682 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.915668 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 15:21:16.926422 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.926401 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 15:21:16.941865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:16.941841 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 15:21:17.011501 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.011456 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-web-config\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.011501 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.011500 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f5c3978-c304-4ce4-a24f-e298635f0b6c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.011719 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.011570 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k86wn\" (UniqueName: \"kubernetes.io/projected/6f5c3978-c304-4ce4-a24f-e298635f0b6c-kube-api-access-k86wn\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.011719 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.011600 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/6f5c3978-c304-4ce4-a24f-e298635f0b6c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.011719 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.011621 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f5c3978-c304-4ce4-a24f-e298635f0b6c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.011719 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.011639 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-config-volume\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.011719 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.011654 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.011719 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.011691 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.011892 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:21:17.011720 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.011892 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.011754 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.011892 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.011777 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6f5c3978-c304-4ce4-a24f-e298635f0b6c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.011892 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.011815 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6f5c3978-c304-4ce4-a24f-e298635f0b6c-config-out\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.011892 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.011834 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.112458 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.112383 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6f3d475-a740-4879-84a4-0bbd9dda11e5" path="/var/lib/kubelet/pods/e6f3d475-a740-4879-84a4-0bbd9dda11e5/volumes" Apr 17 15:21:17.112959 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.112936 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.113005 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.112979 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.113044 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.113009 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.113085 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.113049 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/6f5c3978-c304-4ce4-a24f-e298635f0b6c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.113118 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.113097 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6f5c3978-c304-4ce4-a24f-e298635f0b6c-config-out\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.113162 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.113130 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.113244 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.113175 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-web-config\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.113244 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.113198 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f5c3978-c304-4ce4-a24f-e298635f0b6c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.113409 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.113252 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k86wn\" (UniqueName: \"kubernetes.io/projected/6f5c3978-c304-4ce4-a24f-e298635f0b6c-kube-api-access-k86wn\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.113409 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.113282 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6f5c3978-c304-4ce4-a24f-e298635f0b6c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.113409 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.113333 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f5c3978-c304-4ce4-a24f-e298635f0b6c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.113409 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.113356 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-config-volume\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.113409 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.113380 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.114097 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:21:17.114068 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6f5c3978-c304-4ce4-a24f-e298635f0b6c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.114325 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.114249 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f5c3978-c304-4ce4-a24f-e298635f0b6c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.114827 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.114803 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f5c3978-c304-4ce4-a24f-e298635f0b6c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.116198 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.116179 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.116660 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.116628 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
15:21:17.116766 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.116744 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.116766 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.116747 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-web-config\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.117024 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.117002 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6f5c3978-c304-4ce4-a24f-e298635f0b6c-config-out\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.117114 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.117095 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.117179 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.117131 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.117406 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.117385 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6f5c3978-c304-4ce4-a24f-e298635f0b6c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.117799 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.117784 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6f5c3978-c304-4ce4-a24f-e298635f0b6c-config-volume\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.121797 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.121775 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k86wn\" (UniqueName: \"kubernetes.io/projected/6f5c3978-c304-4ce4-a24f-e298635f0b6c-kube-api-access-k86wn\") pod \"alertmanager-main-0\" (UID: \"6f5c3978-c304-4ce4-a24f-e298635f0b6c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.215598 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.215551 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 15:21:17.335940 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.335916 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 15:21:17.338396 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:21:17.338370 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f5c3978_c304_4ce4_a24f_e298635f0b6c.slice/crio-4eb67f88536a369ed4cc4da1dee7125fd6cc55cd48fdfeaa06cfd608a42f8ca4 WatchSource:0}: Error finding container 4eb67f88536a369ed4cc4da1dee7125fd6cc55cd48fdfeaa06cfd608a42f8ca4: Status 404 returned error can't find the container with id 4eb67f88536a369ed4cc4da1dee7125fd6cc55cd48fdfeaa06cfd608a42f8ca4 Apr 17 15:21:17.850562 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.850530 2567 generic.go:358] "Generic (PLEG): container finished" podID="6f5c3978-c304-4ce4-a24f-e298635f0b6c" containerID="227a95a27c02d05f31fe5fc3ac86e9d2ce217c7f5ad5451e6f0b67df29755211" exitCode=0 Apr 17 15:21:17.850914 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.850582 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6f5c3978-c304-4ce4-a24f-e298635f0b6c","Type":"ContainerDied","Data":"227a95a27c02d05f31fe5fc3ac86e9d2ce217c7f5ad5451e6f0b67df29755211"} Apr 17 15:21:17.850914 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:17.850600 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6f5c3978-c304-4ce4-a24f-e298635f0b6c","Type":"ContainerStarted","Data":"4eb67f88536a369ed4cc4da1dee7125fd6cc55cd48fdfeaa06cfd608a42f8ca4"} Apr 17 15:21:18.856999 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:18.856963 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"6f5c3978-c304-4ce4-a24f-e298635f0b6c","Type":"ContainerStarted","Data":"411e4754d6d231f9beff75e0fcd87006de45603cc59d686e188591b83503c05b"} Apr 17 15:21:18.856999 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:18.857000 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6f5c3978-c304-4ce4-a24f-e298635f0b6c","Type":"ContainerStarted","Data":"bd97aaa71d7450543cb03d4f52a290d6102e23de59148f91c8a3883883f64494"} Apr 17 15:21:18.857423 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:18.857009 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6f5c3978-c304-4ce4-a24f-e298635f0b6c","Type":"ContainerStarted","Data":"2b4126f2d0065262aecb0a37502e378ae23bd45ad41ca1e903edf548f94eeb33"} Apr 17 15:21:18.857423 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:18.857018 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6f5c3978-c304-4ce4-a24f-e298635f0b6c","Type":"ContainerStarted","Data":"5d04ec7b16082742e6617b6191c8d812cce7a609bec9c932624749cc05062102"} Apr 17 15:21:18.857423 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:18.857026 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6f5c3978-c304-4ce4-a24f-e298635f0b6c","Type":"ContainerStarted","Data":"d932288042122d5343ea3b311c2b260d100ee56b0796deb30b6bd63e26a02651"} Apr 17 15:21:18.857423 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:18.857036 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6f5c3978-c304-4ce4-a24f-e298635f0b6c","Type":"ContainerStarted","Data":"0501e898608e885fc6c46931ee0a6a24221c41fff9fe91d84f38239a4ad23028"} Apr 17 15:21:18.882063 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:18.882017 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.882002265 podStartE2EDuration="2.882002265s" podCreationTimestamp="2026-04-17 15:21:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:21:18.880538043 +0000 UTC m=+252.272080078" watchObservedRunningTime="2026-04-17 15:21:18.882002265 +0000 UTC m=+252.273544301" Apr 17 15:21:19.030757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:19.030722 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:21:19.032962 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:19.032936 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48990b07-a036-41ef-a6cd-89d7520c417c-metrics-certs\") pod \"network-metrics-daemon-n8fjz\" (UID: \"48990b07-a036-41ef-a6cd-89d7520c417c\") " pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:21:19.316203 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:19.316171 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9m884\"" Apr 17 15:21:19.324367 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:19.324347 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8fjz" Apr 17 15:21:19.438352 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:19.438325 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n8fjz"] Apr 17 15:21:19.440695 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:21:19.440669 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48990b07_a036_41ef_a6cd_89d7520c417c.slice/crio-601d783632b2f1ee74b0258460424e88b80c8efb5fae8fe203c647979112f56c WatchSource:0}: Error finding container 601d783632b2f1ee74b0258460424e88b80c8efb5fae8fe203c647979112f56c: Status 404 returned error can't find the container with id 601d783632b2f1ee74b0258460424e88b80c8efb5fae8fe203c647979112f56c Apr 17 15:21:19.860685 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:19.860650 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n8fjz" event={"ID":"48990b07-a036-41ef-a6cd-89d7520c417c","Type":"ContainerStarted","Data":"601d783632b2f1ee74b0258460424e88b80c8efb5fae8fe203c647979112f56c"} Apr 17 15:21:21.876227 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:21.876187 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n8fjz" event={"ID":"48990b07-a036-41ef-a6cd-89d7520c417c","Type":"ContainerStarted","Data":"73664e0a2dd9a68257c0e43091eabba62d9fd3428682098ed1ab821ef68ff26a"} Apr 17 15:21:21.876227 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:21.876233 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n8fjz" event={"ID":"48990b07-a036-41ef-a6cd-89d7520c417c","Type":"ContainerStarted","Data":"81ebd35916da7445c4343a123a96cdb5c00d5e4a51a7ca34a34dea692c512221"} Apr 17 15:21:21.891502 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:21.891451 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-n8fjz" podStartSLOduration=253.497274523 podStartE2EDuration="4m14.891436201s" podCreationTimestamp="2026-04-17 15:17:07 +0000 UTC" firstStartedPulling="2026-04-17 15:21:19.44248909 +0000 UTC m=+252.834031104" lastFinishedPulling="2026-04-17 15:21:20.836650758 +0000 UTC m=+254.228192782" observedRunningTime="2026-04-17 15:21:21.89009482 +0000 UTC m=+255.281636856" watchObservedRunningTime="2026-04-17 15:21:21.891436201 +0000 UTC m=+255.282978237" Apr 17 15:21:24.283078 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:24.283035 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:24.283078 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:24.283085 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:24.288037 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:24.288019 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:21:24.889808 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:21:24.889779 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:22:07.003781 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:07.003753 2567 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 15:22:21.445513 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.445483 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv"] Apr 17 15:22:21.448826 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.448812 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:21.451686 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.451660 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 15:22:21.451805 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.451660 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x4vvd\"" Apr 17 15:22:21.452713 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.452692 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 15:22:21.457806 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.457786 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv"] Apr 17 15:22:21.508937 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.508912 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjhp5\" (UniqueName: \"kubernetes.io/projected/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-kube-api-access-qjhp5\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv\" (UID: \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:21.509074 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.508962 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv\" (UID: \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:21.509074 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.509000 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv\" (UID: \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:21.609909 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.609882 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjhp5\" (UniqueName: \"kubernetes.io/projected/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-kube-api-access-qjhp5\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv\" (UID: \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:21.610077 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.609934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv\" (UID: \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:21.610077 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.609971 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv\" (UID: \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:21.610337 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.610292 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv\" (UID: \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:21.610410 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.610348 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv\" (UID: \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:21.618698 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.618679 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjhp5\" (UniqueName: \"kubernetes.io/projected/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-kube-api-access-qjhp5\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv\" (UID: \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:21.758406 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.758337 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:21.874853 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.874824 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv"] Apr 17 15:22:21.877862 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:22:21.877838 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5f1ca25_3f44_413d_8f1c_c654f6b1d4b2.slice/crio-0f535fdc588713d5ea059ac483e94692745114ce207ef82dc806c566f6d7e535 WatchSource:0}: Error finding container 0f535fdc588713d5ea059ac483e94692745114ce207ef82dc806c566f6d7e535: Status 404 returned error can't find the container with id 0f535fdc588713d5ea059ac483e94692745114ce207ef82dc806c566f6d7e535 Apr 17 15:22:21.879682 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:21.879666 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 15:22:22.050555 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:22.050474 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" event={"ID":"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2","Type":"ContainerStarted","Data":"0f535fdc588713d5ea059ac483e94692745114ce207ef82dc806c566f6d7e535"} Apr 17 15:22:30.081320 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:30.081284 2567 generic.go:358] "Generic (PLEG): container finished" podID="b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" containerID="e9aedb55068d2b2fcfc1328ac691f87c00a75e89a52de0152e86031f71aa9a8a" exitCode=0 Apr 17 15:22:30.081794 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:30.081348 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" 
event={"ID":"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2","Type":"ContainerDied","Data":"e9aedb55068d2b2fcfc1328ac691f87c00a75e89a52de0152e86031f71aa9a8a"} Apr 17 15:22:33.093707 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:33.093675 2567 generic.go:358] "Generic (PLEG): container finished" podID="b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" containerID="11dbacf8eb2648d9552cbd758f1acce9c4a2c4bf062ad3c7e07405f76d669a65" exitCode=0 Apr 17 15:22:33.093707 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:33.093709 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" event={"ID":"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2","Type":"ContainerDied","Data":"11dbacf8eb2648d9552cbd758f1acce9c4a2c4bf062ad3c7e07405f76d669a65"} Apr 17 15:22:44.127172 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:44.127137 2567 generic.go:358] "Generic (PLEG): container finished" podID="b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" containerID="c02b0e8d8bc4b8dbb4e7882b494b6d7e4fb849dde02a0ada52b1a59f7a0291f3" exitCode=0 Apr 17 15:22:44.127625 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:44.127223 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" event={"ID":"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2","Type":"ContainerDied","Data":"c02b0e8d8bc4b8dbb4e7882b494b6d7e4fb849dde02a0ada52b1a59f7a0291f3"} Apr 17 15:22:45.246086 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:45.246064 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:45.307074 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:45.307041 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-bundle\") pod \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\" (UID: \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\") " Apr 17 15:22:45.307231 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:45.307099 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-util\") pod \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\" (UID: \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\") " Apr 17 15:22:45.307231 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:45.307165 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjhp5\" (UniqueName: \"kubernetes.io/projected/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-kube-api-access-qjhp5\") pod \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\" (UID: \"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2\") " Apr 17 15:22:45.307657 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:45.307623 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-bundle" (OuterVolumeSpecName: "bundle") pod "b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" (UID: "b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:22:45.309232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:45.309212 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-kube-api-access-qjhp5" (OuterVolumeSpecName: "kube-api-access-qjhp5") pod "b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" (UID: "b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2"). InnerVolumeSpecName "kube-api-access-qjhp5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:22:45.312945 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:45.312919 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-util" (OuterVolumeSpecName: "util") pod "b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" (UID: "b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:22:45.408041 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:45.407951 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-bundle\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:22:45.408041 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:45.407985 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-util\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:22:45.408041 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:45.407995 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qjhp5\" (UniqueName: \"kubernetes.io/projected/b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2-kube-api-access-qjhp5\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:22:46.133420 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:46.133390 2567 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" Apr 17 15:22:46.133581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:46.133388 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5d7pbv" event={"ID":"b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2","Type":"ContainerDied","Data":"0f535fdc588713d5ea059ac483e94692745114ce207ef82dc806c566f6d7e535"} Apr 17 15:22:46.133581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:46.133502 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f535fdc588713d5ea059ac483e94692745114ce207ef82dc806c566f6d7e535" Apr 17 15:22:48.769697 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.769663 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt"] Apr 17 15:22:48.770128 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.769968 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" containerName="pull" Apr 17 15:22:48.770128 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.769978 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" containerName="pull" Apr 17 15:22:48.770128 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.769996 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" containerName="util" Apr 17 15:22:48.770128 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.770001 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" containerName="util" Apr 17 15:22:48.770128 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.770010 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" containerName="extract" Apr 17 15:22:48.770128 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.770016 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" containerName="extract" Apr 17 15:22:48.770128 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.770058 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5f1ca25-3f44-413d-8f1c-c654f6b1d4b2" containerName="extract" Apr 17 15:22:48.773982 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.773966 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt" Apr 17 15:22:48.776447 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.776426 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 17 15:22:48.776552 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.776458 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 17 15:22:48.776552 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.776475 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-9tskp\"" Apr 17 15:22:48.783221 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.783201 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt"] Apr 17 15:22:48.837070 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.837041 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2967e78e-ccd3-43c4-9283-2cd2c50c1dd6-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r8dgt\" (UID: 
\"2967e78e-ccd3-43c4-9283-2cd2c50c1dd6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt" Apr 17 15:22:48.837208 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.837083 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8zlk\" (UniqueName: \"kubernetes.io/projected/2967e78e-ccd3-43c4-9283-2cd2c50c1dd6-kube-api-access-m8zlk\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r8dgt\" (UID: \"2967e78e-ccd3-43c4-9283-2cd2c50c1dd6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt" Apr 17 15:22:48.938478 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.938451 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2967e78e-ccd3-43c4-9283-2cd2c50c1dd6-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r8dgt\" (UID: \"2967e78e-ccd3-43c4-9283-2cd2c50c1dd6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt" Apr 17 15:22:48.938600 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.938490 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8zlk\" (UniqueName: \"kubernetes.io/projected/2967e78e-ccd3-43c4-9283-2cd2c50c1dd6-kube-api-access-m8zlk\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r8dgt\" (UID: \"2967e78e-ccd3-43c4-9283-2cd2c50c1dd6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt" Apr 17 15:22:48.938827 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.938807 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2967e78e-ccd3-43c4-9283-2cd2c50c1dd6-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r8dgt\" (UID: \"2967e78e-ccd3-43c4-9283-2cd2c50c1dd6\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt" Apr 17 15:22:48.946500 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:48.946474 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8zlk\" (UniqueName: \"kubernetes.io/projected/2967e78e-ccd3-43c4-9283-2cd2c50c1dd6-kube-api-access-m8zlk\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r8dgt\" (UID: \"2967e78e-ccd3-43c4-9283-2cd2c50c1dd6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt" Apr 17 15:22:49.082985 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:49.082884 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt" Apr 17 15:22:49.203661 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:49.203637 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt"] Apr 17 15:22:49.206492 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:22:49.206462 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2967e78e_ccd3_43c4_9283_2cd2c50c1dd6.slice/crio-70d5b818f52ca998fd5bdc70b06189311be7a26f687bd9614c04a0f6bad50546 WatchSource:0}: Error finding container 70d5b818f52ca998fd5bdc70b06189311be7a26f687bd9614c04a0f6bad50546: Status 404 returned error can't find the container with id 70d5b818f52ca998fd5bdc70b06189311be7a26f687bd9614c04a0f6bad50546 Apr 17 15:22:50.147221 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:50.147185 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt" event={"ID":"2967e78e-ccd3-43c4-9283-2cd2c50c1dd6","Type":"ContainerStarted","Data":"70d5b818f52ca998fd5bdc70b06189311be7a26f687bd9614c04a0f6bad50546"} Apr 17 15:22:52.155306 
ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:52.155267 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt" event={"ID":"2967e78e-ccd3-43c4-9283-2cd2c50c1dd6","Type":"ContainerStarted","Data":"37015f25e0494d41fe4176ed5bc32ed7b53711c8a6fb808d79c803f54fb7f6c3"} Apr 17 15:22:52.177580 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:52.177525 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r8dgt" podStartSLOduration=2.094203727 podStartE2EDuration="4.177511894s" podCreationTimestamp="2026-04-17 15:22:48 +0000 UTC" firstStartedPulling="2026-04-17 15:22:49.208793823 +0000 UTC m=+342.600335837" lastFinishedPulling="2026-04-17 15:22:51.29210199 +0000 UTC m=+344.683644004" observedRunningTime="2026-04-17 15:22:52.177326645 +0000 UTC m=+345.568868675" watchObservedRunningTime="2026-04-17 15:22:52.177511894 +0000 UTC m=+345.569053928" Apr 17 15:22:53.651001 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.650964 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg"] Apr 17 15:22:53.654418 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.654401 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" Apr 17 15:22:53.656588 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.656566 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 15:22:53.656708 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.656566 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x4vvd\"" Apr 17 15:22:53.657696 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.657681 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 15:22:53.661214 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.661191 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg"] Apr 17 15:22:53.779785 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.779757 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f5b8999-9641-4245-b5ea-8176499c000f-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg\" (UID: \"9f5b8999-9641-4245-b5ea-8176499c000f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" Apr 17 15:22:53.779927 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.779797 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f5b8999-9641-4245-b5ea-8176499c000f-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg\" (UID: \"9f5b8999-9641-4245-b5ea-8176499c000f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" Apr 17 15:22:53.779927 
ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.779827 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5d9x\" (UniqueName: \"kubernetes.io/projected/9f5b8999-9641-4245-b5ea-8176499c000f-kube-api-access-v5d9x\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg\" (UID: \"9f5b8999-9641-4245-b5ea-8176499c000f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" Apr 17 15:22:53.881018 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.880982 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f5b8999-9641-4245-b5ea-8176499c000f-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg\" (UID: \"9f5b8999-9641-4245-b5ea-8176499c000f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" Apr 17 15:22:53.881178 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.881036 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5d9x\" (UniqueName: \"kubernetes.io/projected/9f5b8999-9641-4245-b5ea-8176499c000f-kube-api-access-v5d9x\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg\" (UID: \"9f5b8999-9641-4245-b5ea-8176499c000f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" Apr 17 15:22:53.881178 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.881090 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f5b8999-9641-4245-b5ea-8176499c000f-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg\" (UID: \"9f5b8999-9641-4245-b5ea-8176499c000f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" Apr 17 15:22:53.881392 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:22:53.881372 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f5b8999-9641-4245-b5ea-8176499c000f-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg\" (UID: \"9f5b8999-9641-4245-b5ea-8176499c000f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" Apr 17 15:22:53.881472 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.881453 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f5b8999-9641-4245-b5ea-8176499c000f-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg\" (UID: \"9f5b8999-9641-4245-b5ea-8176499c000f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" Apr 17 15:22:53.888962 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.888942 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5d9x\" (UniqueName: \"kubernetes.io/projected/9f5b8999-9641-4245-b5ea-8176499c000f-kube-api-access-v5d9x\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg\" (UID: \"9f5b8999-9641-4245-b5ea-8176499c000f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" Apr 17 15:22:53.964398 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:53.964299 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" Apr 17 15:22:54.083145 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:54.083120 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg"] Apr 17 15:22:54.085594 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:22:54.085566 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f5b8999_9641_4245_b5ea_8176499c000f.slice/crio-236e9bd28ccc12945f221c9ae582701a2d6520b99fe61d64e169d88237852b07 WatchSource:0}: Error finding container 236e9bd28ccc12945f221c9ae582701a2d6520b99fe61d64e169d88237852b07: Status 404 returned error can't find the container with id 236e9bd28ccc12945f221c9ae582701a2d6520b99fe61d64e169d88237852b07 Apr 17 15:22:54.167673 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:54.167638 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" event={"ID":"9f5b8999-9641-4245-b5ea-8176499c000f","Type":"ContainerStarted","Data":"670bca01438646601591a4a3b1306bba93a1b536ed3d8f409dff0eabc1460ca1"} Apr 17 15:22:54.167673 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:54.167677 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" event={"ID":"9f5b8999-9641-4245-b5ea-8176499c000f","Type":"ContainerStarted","Data":"236e9bd28ccc12945f221c9ae582701a2d6520b99fe61d64e169d88237852b07"} Apr 17 15:22:55.172054 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:55.171974 2567 generic.go:358] "Generic (PLEG): container finished" podID="9f5b8999-9641-4245-b5ea-8176499c000f" containerID="670bca01438646601591a4a3b1306bba93a1b536ed3d8f409dff0eabc1460ca1" exitCode=0 Apr 17 15:22:55.172054 ip-10-0-130-92 kubenswrapper[2567]: 
I0417 15:22:55.172038 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" event={"ID":"9f5b8999-9641-4245-b5ea-8176499c000f","Type":"ContainerDied","Data":"670bca01438646601591a4a3b1306bba93a1b536ed3d8f409dff0eabc1460ca1"} Apr 17 15:22:56.867902 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:56.867814 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2nzsx"] Apr 17 15:22:56.871280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:56.871257 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx" Apr 17 15:22:56.873654 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:56.873585 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 15:22:56.874736 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:56.874717 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-rqvjq\"" Apr 17 15:22:56.874849 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:56.874741 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 15:22:56.877548 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:56.877528 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2nzsx"] Apr 17 15:22:57.011521 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:57.011489 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f78sg\" (UniqueName: \"kubernetes.io/projected/2485af37-ac6a-4324-8829-54774f2a2d42-kube-api-access-f78sg\") pod \"cert-manager-webhook-597b96b99b-2nzsx\" (UID: \"2485af37-ac6a-4324-8829-54774f2a2d42\") " 
pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx" Apr 17 15:22:57.011684 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:57.011562 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2485af37-ac6a-4324-8829-54774f2a2d42-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2nzsx\" (UID: \"2485af37-ac6a-4324-8829-54774f2a2d42\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx" Apr 17 15:22:57.117079 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:57.116927 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f78sg\" (UniqueName: \"kubernetes.io/projected/2485af37-ac6a-4324-8829-54774f2a2d42-kube-api-access-f78sg\") pod \"cert-manager-webhook-597b96b99b-2nzsx\" (UID: \"2485af37-ac6a-4324-8829-54774f2a2d42\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx" Apr 17 15:22:57.117278 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:57.117110 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2485af37-ac6a-4324-8829-54774f2a2d42-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2nzsx\" (UID: \"2485af37-ac6a-4324-8829-54774f2a2d42\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx" Apr 17 15:22:57.125172 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:57.125103 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2485af37-ac6a-4324-8829-54774f2a2d42-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2nzsx\" (UID: \"2485af37-ac6a-4324-8829-54774f2a2d42\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx" Apr 17 15:22:57.125333 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:57.125237 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f78sg\" (UniqueName: 
\"kubernetes.io/projected/2485af37-ac6a-4324-8829-54774f2a2d42-kube-api-access-f78sg\") pod \"cert-manager-webhook-597b96b99b-2nzsx\" (UID: \"2485af37-ac6a-4324-8829-54774f2a2d42\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx"
Apr 17 15:22:57.192217 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:57.192183 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx"
Apr 17 15:22:57.841181 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:57.841156 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2nzsx"]
Apr 17 15:22:57.842524 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:22:57.842501 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2485af37_ac6a_4324_8829_54774f2a2d42.slice/crio-e7bc9860ebffe59fba8f79e50da8e735ef868cc3d62e3a97bf4bc8a6a6fc51cd WatchSource:0}: Error finding container e7bc9860ebffe59fba8f79e50da8e735ef868cc3d62e3a97bf4bc8a6a6fc51cd: Status 404 returned error can't find the container with id e7bc9860ebffe59fba8f79e50da8e735ef868cc3d62e3a97bf4bc8a6a6fc51cd
Apr 17 15:22:58.182249 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:58.182211 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx" event={"ID":"2485af37-ac6a-4324-8829-54774f2a2d42","Type":"ContainerStarted","Data":"e7bc9860ebffe59fba8f79e50da8e735ef868cc3d62e3a97bf4bc8a6a6fc51cd"}
Apr 17 15:22:58.183739 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:58.183715 2567 generic.go:358] "Generic (PLEG): container finished" podID="9f5b8999-9641-4245-b5ea-8176499c000f" containerID="b24f6bfa09bd392ae34bbcc9018b779ba9b58eddf757477a50569964ba25c78c" exitCode=0
Apr 17 15:22:58.183845 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:58.183794 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" event={"ID":"9f5b8999-9641-4245-b5ea-8176499c000f","Type":"ContainerDied","Data":"b24f6bfa09bd392ae34bbcc9018b779ba9b58eddf757477a50569964ba25c78c"}
Apr 17 15:22:59.190094 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:59.189979 2567 generic.go:358] "Generic (PLEG): container finished" podID="9f5b8999-9641-4245-b5ea-8176499c000f" containerID="454ae6e160c97f04d21e77b0351052caa9f6784775aa0528b54480354489cffa" exitCode=0
Apr 17 15:22:59.190094 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:22:59.190030 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" event={"ID":"9f5b8999-9641-4245-b5ea-8176499c000f","Type":"ContainerDied","Data":"454ae6e160c97f04d21e77b0351052caa9f6784775aa0528b54480354489cffa"}
Apr 17 15:23:01.101117 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.101095 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg"
Apr 17 15:23:01.198877 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.198852 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg" event={"ID":"9f5b8999-9641-4245-b5ea-8176499c000f","Type":"ContainerDied","Data":"236e9bd28ccc12945f221c9ae582701a2d6520b99fe61d64e169d88237852b07"}
Apr 17 15:23:01.198877 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.198874 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffr6rg"
Apr 17 15:23:01.199019 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.198883 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="236e9bd28ccc12945f221c9ae582701a2d6520b99fe61d64e169d88237852b07"
Apr 17 15:23:01.253282 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.253256 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f5b8999-9641-4245-b5ea-8176499c000f-util\") pod \"9f5b8999-9641-4245-b5ea-8176499c000f\" (UID: \"9f5b8999-9641-4245-b5ea-8176499c000f\") "
Apr 17 15:23:01.253463 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.253359 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f5b8999-9641-4245-b5ea-8176499c000f-bundle\") pod \"9f5b8999-9641-4245-b5ea-8176499c000f\" (UID: \"9f5b8999-9641-4245-b5ea-8176499c000f\") "
Apr 17 15:23:01.253463 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.253393 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5d9x\" (UniqueName: \"kubernetes.io/projected/9f5b8999-9641-4245-b5ea-8176499c000f-kube-api-access-v5d9x\") pod \"9f5b8999-9641-4245-b5ea-8176499c000f\" (UID: \"9f5b8999-9641-4245-b5ea-8176499c000f\") "
Apr 17 15:23:01.253742 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.253717 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f5b8999-9641-4245-b5ea-8176499c000f-bundle" (OuterVolumeSpecName: "bundle") pod "9f5b8999-9641-4245-b5ea-8176499c000f" (UID: "9f5b8999-9641-4245-b5ea-8176499c000f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:23:01.255459 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.255433 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5b8999-9641-4245-b5ea-8176499c000f-kube-api-access-v5d9x" (OuterVolumeSpecName: "kube-api-access-v5d9x") pod "9f5b8999-9641-4245-b5ea-8176499c000f" (UID: "9f5b8999-9641-4245-b5ea-8176499c000f"). InnerVolumeSpecName "kube-api-access-v5d9x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 15:23:01.258741 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.258719 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f5b8999-9641-4245-b5ea-8176499c000f-util" (OuterVolumeSpecName: "util") pod "9f5b8999-9641-4245-b5ea-8176499c000f" (UID: "9f5b8999-9641-4245-b5ea-8176499c000f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:23:01.354729 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.354651 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f5b8999-9641-4245-b5ea-8176499c000f-bundle\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:23:01.354729 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.354678 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v5d9x\" (UniqueName: \"kubernetes.io/projected/9f5b8999-9641-4245-b5ea-8176499c000f-kube-api-access-v5d9x\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:23:01.354729 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:01.354688 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f5b8999-9641-4245-b5ea-8176499c000f-util\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:23:02.202951 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:02.202915 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx" event={"ID":"2485af37-ac6a-4324-8829-54774f2a2d42","Type":"ContainerStarted","Data":"cd3918f70f0c5700066d094183bf960032228eb9c5df161409e44d73459974a1"}
Apr 17 15:23:02.203389 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:02.203016 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx"
Apr 17 15:23:02.219731 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:02.219687 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx" podStartSLOduration=2.906596723 podStartE2EDuration="6.219674164s" podCreationTimestamp="2026-04-17 15:22:56 +0000 UTC" firstStartedPulling="2026-04-17 15:22:57.846783855 +0000 UTC m=+351.238325869" lastFinishedPulling="2026-04-17 15:23:01.159861286 +0000 UTC m=+354.551403310" observedRunningTime="2026-04-17 15:23:02.217723877 +0000 UTC m=+355.609265911" watchObservedRunningTime="2026-04-17 15:23:02.219674164 +0000 UTC m=+355.611216198"
Apr 17 15:23:08.209841 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:08.209808 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-2nzsx"
Apr 17 15:23:12.526606 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.526568 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l"]
Apr 17 15:23:12.527054 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.526868 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f5b8999-9641-4245-b5ea-8176499c000f" containerName="pull"
Apr 17 15:23:12.527054 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.526879 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5b8999-9641-4245-b5ea-8176499c000f" containerName="pull"
Apr 17 15:23:12.527054 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.526888 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f5b8999-9641-4245-b5ea-8176499c000f" containerName="extract"
Apr 17 15:23:12.527054 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.526894 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5b8999-9641-4245-b5ea-8176499c000f" containerName="extract"
Apr 17 15:23:12.527054 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.526902 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f5b8999-9641-4245-b5ea-8176499c000f" containerName="util"
Apr 17 15:23:12.527054 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.526908 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5b8999-9641-4245-b5ea-8176499c000f" containerName="util"
Apr 17 15:23:12.527054 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.526959 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f5b8999-9641-4245-b5ea-8176499c000f" containerName="extract"
Apr 17 15:23:12.576227 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.576194 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l"]
Apr 17 15:23:12.576227 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.576221 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l"
Apr 17 15:23:12.578721 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.578695 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 15:23:12.578970 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.578955 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-xr8mq\""
Apr 17 15:23:12.579793 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.579771 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 15:23:12.648263 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.648240 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz45w\" (UniqueName: \"kubernetes.io/projected/7da6be51-382d-470e-9b6e-dee16a790c0e-kube-api-access-xz45w\") pod \"openshift-lws-operator-bfc7f696d-5p64l\" (UID: \"7da6be51-382d-470e-9b6e-dee16a790c0e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l"
Apr 17 15:23:12.648410 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.648280 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7da6be51-382d-470e-9b6e-dee16a790c0e-tmp\") pod \"openshift-lws-operator-bfc7f696d-5p64l\" (UID: \"7da6be51-382d-470e-9b6e-dee16a790c0e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l"
Apr 17 15:23:12.748652 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.748621 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7da6be51-382d-470e-9b6e-dee16a790c0e-tmp\") pod \"openshift-lws-operator-bfc7f696d-5p64l\" (UID: \"7da6be51-382d-470e-9b6e-dee16a790c0e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l"
Apr 17 15:23:12.748773 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.748713 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz45w\" (UniqueName: \"kubernetes.io/projected/7da6be51-382d-470e-9b6e-dee16a790c0e-kube-api-access-xz45w\") pod \"openshift-lws-operator-bfc7f696d-5p64l\" (UID: \"7da6be51-382d-470e-9b6e-dee16a790c0e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l"
Apr 17 15:23:12.749072 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.749047 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7da6be51-382d-470e-9b6e-dee16a790c0e-tmp\") pod \"openshift-lws-operator-bfc7f696d-5p64l\" (UID: \"7da6be51-382d-470e-9b6e-dee16a790c0e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l"
Apr 17 15:23:12.757180 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.757159 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz45w\" (UniqueName: \"kubernetes.io/projected/7da6be51-382d-470e-9b6e-dee16a790c0e-kube-api-access-xz45w\") pod \"openshift-lws-operator-bfc7f696d-5p64l\" (UID: \"7da6be51-382d-470e-9b6e-dee16a790c0e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l"
Apr 17 15:23:12.885152 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:12.885126 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l"
Apr 17 15:23:13.002856 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.002833 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l"]
Apr 17 15:23:13.011519 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:23:13.011446 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7da6be51_382d_470e_9b6e_dee16a790c0e.slice/crio-07c88e293aa8fb35851558f7cfddeffd604b08208258f2e7c48884a623e22e43 WatchSource:0}: Error finding container 07c88e293aa8fb35851558f7cfddeffd604b08208258f2e7c48884a623e22e43: Status 404 returned error can't find the container with id 07c88e293aa8fb35851558f7cfddeffd604b08208258f2e7c48884a623e22e43
Apr 17 15:23:13.201450 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.201370 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-flwzx"]
Apr 17 15:23:13.206143 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.206126 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-flwzx"
Apr 17 15:23:13.208595 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.208575 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-vj5n4\""
Apr 17 15:23:13.214599 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.214578 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-flwzx"]
Apr 17 15:23:13.241559 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.241533 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l" event={"ID":"7da6be51-382d-470e-9b6e-dee16a790c0e","Type":"ContainerStarted","Data":"07c88e293aa8fb35851558f7cfddeffd604b08208258f2e7c48884a623e22e43"}
Apr 17 15:23:13.252905 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.252883 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da97bb4f-de71-40c6-bf2c-f8186387efeb-bound-sa-token\") pod \"cert-manager-759f64656b-flwzx\" (UID: \"da97bb4f-de71-40c6-bf2c-f8186387efeb\") " pod="cert-manager/cert-manager-759f64656b-flwzx"
Apr 17 15:23:13.252978 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.252936 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd767\" (UniqueName: \"kubernetes.io/projected/da97bb4f-de71-40c6-bf2c-f8186387efeb-kube-api-access-sd767\") pod \"cert-manager-759f64656b-flwzx\" (UID: \"da97bb4f-de71-40c6-bf2c-f8186387efeb\") " pod="cert-manager/cert-manager-759f64656b-flwzx"
Apr 17 15:23:13.353880 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.353848 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da97bb4f-de71-40c6-bf2c-f8186387efeb-bound-sa-token\") pod \"cert-manager-759f64656b-flwzx\" (UID: \"da97bb4f-de71-40c6-bf2c-f8186387efeb\") " pod="cert-manager/cert-manager-759f64656b-flwzx"
Apr 17 15:23:13.354050 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.353916 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sd767\" (UniqueName: \"kubernetes.io/projected/da97bb4f-de71-40c6-bf2c-f8186387efeb-kube-api-access-sd767\") pod \"cert-manager-759f64656b-flwzx\" (UID: \"da97bb4f-de71-40c6-bf2c-f8186387efeb\") " pod="cert-manager/cert-manager-759f64656b-flwzx"
Apr 17 15:23:13.361907 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.361883 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da97bb4f-de71-40c6-bf2c-f8186387efeb-bound-sa-token\") pod \"cert-manager-759f64656b-flwzx\" (UID: \"da97bb4f-de71-40c6-bf2c-f8186387efeb\") " pod="cert-manager/cert-manager-759f64656b-flwzx"
Apr 17 15:23:13.361988 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.361957 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd767\" (UniqueName: \"kubernetes.io/projected/da97bb4f-de71-40c6-bf2c-f8186387efeb-kube-api-access-sd767\") pod \"cert-manager-759f64656b-flwzx\" (UID: \"da97bb4f-de71-40c6-bf2c-f8186387efeb\") " pod="cert-manager/cert-manager-759f64656b-flwzx"
Apr 17 15:23:13.516727 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.516636 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-flwzx"
Apr 17 15:23:13.637155 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:13.637129 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-flwzx"]
Apr 17 15:23:13.638842 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:23:13.638812 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda97bb4f_de71_40c6_bf2c_f8186387efeb.slice/crio-18db7e0cc9d5068448b85016be7d2e0c7c986625d8a325a9c03ce4c842c7be0d WatchSource:0}: Error finding container 18db7e0cc9d5068448b85016be7d2e0c7c986625d8a325a9c03ce4c842c7be0d: Status 404 returned error can't find the container with id 18db7e0cc9d5068448b85016be7d2e0c7c986625d8a325a9c03ce4c842c7be0d
Apr 17 15:23:14.250677 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:14.250638 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-flwzx" event={"ID":"da97bb4f-de71-40c6-bf2c-f8186387efeb","Type":"ContainerStarted","Data":"41332a05674725524c9d0caca0df0d441555aaf315afcf674ec1cd9358d99861"}
Apr 17 15:23:14.250677 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:14.250677 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-flwzx" event={"ID":"da97bb4f-de71-40c6-bf2c-f8186387efeb","Type":"ContainerStarted","Data":"18db7e0cc9d5068448b85016be7d2e0c7c986625d8a325a9c03ce4c842c7be0d"}
Apr 17 15:23:14.267248 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:14.267197 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-flwzx" podStartSLOduration=1.2671814270000001 podStartE2EDuration="1.267181427s" podCreationTimestamp="2026-04-17 15:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:23:14.265336046 +0000 UTC m=+367.656878079" watchObservedRunningTime="2026-04-17 15:23:14.267181427 +0000 UTC m=+367.658723461"
Apr 17 15:23:15.256647 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:15.256615 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l" event={"ID":"7da6be51-382d-470e-9b6e-dee16a790c0e","Type":"ContainerStarted","Data":"1081dbc28e2646dca73b3def3b10138e3462fd159f137093a420dc54a3d3b82a"}
Apr 17 15:23:15.271785 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:15.271744 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5p64l" podStartSLOduration=1.41559605 podStartE2EDuration="3.271730475s" podCreationTimestamp="2026-04-17 15:23:12 +0000 UTC" firstStartedPulling="2026-04-17 15:23:13.012545303 +0000 UTC m=+366.404087316" lastFinishedPulling="2026-04-17 15:23:14.86867972 +0000 UTC m=+368.260221741" observedRunningTime="2026-04-17 15:23:15.271106003 +0000 UTC m=+368.662648038" watchObservedRunningTime="2026-04-17 15:23:15.271730475 +0000 UTC m=+368.663272511"
Apr 17 15:23:19.046391 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.046360 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"]
Apr 17 15:23:19.049853 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.049836 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:19.052063 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.052034 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 15:23:19.052186 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.052099 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x4vvd\""
Apr 17 15:23:19.053297 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.053274 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 15:23:19.055978 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.055956 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"]
Apr 17 15:23:19.095051 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.095022 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb\" (UID: \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:19.095164 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.095078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xgdr\" (UniqueName: \"kubernetes.io/projected/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-kube-api-access-2xgdr\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb\" (UID: \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:19.095164 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.095139 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb\" (UID: \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:19.196382 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.196346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xgdr\" (UniqueName: \"kubernetes.io/projected/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-kube-api-access-2xgdr\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb\" (UID: \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:19.196382 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.196387 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb\" (UID: \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:19.196568 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.196438 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb\" (UID: \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:19.196780 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.196761 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb\" (UID: \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:19.196815 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.196792 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb\" (UID: \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:19.204371 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.204342 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xgdr\" (UniqueName: \"kubernetes.io/projected/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-kube-api-access-2xgdr\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb\" (UID: \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:19.360144 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.360065 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:19.479899 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:19.479876 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"]
Apr 17 15:23:19.481882 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:23:19.481849 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e3b6c1e_7891_41c4_bb83_b1ec093a0ee7.slice/crio-238dd9aa99063b567173145866d9a061ff3f1ef2074007690ddb7db68f0855a1 WatchSource:0}: Error finding container 238dd9aa99063b567173145866d9a061ff3f1ef2074007690ddb7db68f0855a1: Status 404 returned error can't find the container with id 238dd9aa99063b567173145866d9a061ff3f1ef2074007690ddb7db68f0855a1
Apr 17 15:23:20.272325 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:20.272275 2567 generic.go:358] "Generic (PLEG): container finished" podID="2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" containerID="363dac16892db7ec8daf0e85eaac6cbf7adbad6fe68eda425910fca10241d71f" exitCode=0
Apr 17 15:23:20.272674 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:20.272373 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb" event={"ID":"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7","Type":"ContainerDied","Data":"363dac16892db7ec8daf0e85eaac6cbf7adbad6fe68eda425910fca10241d71f"}
Apr 17 15:23:20.272674 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:20.272400 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb" event={"ID":"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7","Type":"ContainerStarted","Data":"238dd9aa99063b567173145866d9a061ff3f1ef2074007690ddb7db68f0855a1"}
Apr 17 15:23:21.278182 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:21.278147 2567 generic.go:358] "Generic (PLEG): container finished" podID="2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" containerID="0c4c9b990377179935cb629ef572d30f6fa31e01015530c18b2d272e663ba397" exitCode=0
Apr 17 15:23:21.278637 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:21.278239 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb" event={"ID":"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7","Type":"ContainerDied","Data":"0c4c9b990377179935cb629ef572d30f6fa31e01015530c18b2d272e663ba397"}
Apr 17 15:23:22.288275 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:22.288245 2567 generic.go:358] "Generic (PLEG): container finished" podID="2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" containerID="f1d4799c1ef7dd2b5a9825a15e2ea065d7c4e5438e8cfcabaa63c9f171bd8f8a" exitCode=0
Apr 17 15:23:22.288662 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:22.288287 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb" event={"ID":"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7","Type":"ContainerDied","Data":"f1d4799c1ef7dd2b5a9825a15e2ea065d7c4e5438e8cfcabaa63c9f171bd8f8a"}
Apr 17 15:23:23.436244 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:23.436220 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:23.527577 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:23.527508 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-util\") pod \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\" (UID: \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\") "
Apr 17 15:23:23.527771 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:23.527616 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-bundle\") pod \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\" (UID: \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\") "
Apr 17 15:23:23.527771 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:23.527693 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xgdr\" (UniqueName: \"kubernetes.io/projected/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-kube-api-access-2xgdr\") pod \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\" (UID: \"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7\") "
Apr 17 15:23:23.528604 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:23.528573 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-bundle" (OuterVolumeSpecName: "bundle") pod "2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" (UID: "2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:23:23.530072 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:23.530049 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-kube-api-access-2xgdr" (OuterVolumeSpecName: "kube-api-access-2xgdr") pod "2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" (UID: "2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7"). InnerVolumeSpecName "kube-api-access-2xgdr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 15:23:23.533683 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:23.533657 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-util" (OuterVolumeSpecName: "util") pod "2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" (UID: "2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:23:23.628626 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:23.628603 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2xgdr\" (UniqueName: \"kubernetes.io/projected/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-kube-api-access-2xgdr\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:23:23.628626 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:23.628627 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-util\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:23:23.628784 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:23.628638 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7-bundle\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:23:24.296376 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:24.296335 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb" event={"ID":"2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7","Type":"ContainerDied","Data":"238dd9aa99063b567173145866d9a061ff3f1ef2074007690ddb7db68f0855a1"}
Apr 17 15:23:24.296376 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:24.296360 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5qb2sb"
Apr 17 15:23:24.296376 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:24.296373 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="238dd9aa99063b567173145866d9a061ff3f1ef2074007690ddb7db68f0855a1"
Apr 17 15:23:29.089757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.089721 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78"]
Apr 17 15:23:29.090156 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.090029 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" containerName="util"
Apr 17 15:23:29.090156 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.090039 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" containerName="util"
Apr 17 15:23:29.090156 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.090050 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" containerName="extract"
Apr 17 15:23:29.090156 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.090056 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" containerName="extract"
Apr 17 15:23:29.090156 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.090062 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" containerName="pull"
Apr 17 15:23:29.090156 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.090067 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" containerName="pull"
Apr 17 15:23:29.090156 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.090123 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e3b6c1e-7891-41c4-bb83-b1ec093a0ee7" containerName="extract"
Apr 17 15:23:29.093089 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.093073 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78"
Apr 17 15:23:29.095739 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.095713 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 15:23:29.095869 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.095813 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 15:23:29.096756 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.096742 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x4vvd\""
Apr 17 15:23:29.100970 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.100949 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78"]
Apr 17 15:23:29.174286 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.174250 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244656f4-836d-4a3e-8e59-7cf0742a314b-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78\" (UID: \"244656f4-836d-4a3e-8e59-7cf0742a314b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78"
Apr 17 15:23:29.174440 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.174340 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244656f4-836d-4a3e-8e59-7cf0742a314b-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78\" (UID: \"244656f4-836d-4a3e-8e59-7cf0742a314b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78"
Apr 17 15:23:29.174440 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.174366 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ncxj\" (UniqueName: \"kubernetes.io/projected/244656f4-836d-4a3e-8e59-7cf0742a314b-kube-api-access-5ncxj\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78\" (UID: \"244656f4-836d-4a3e-8e59-7cf0742a314b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78"
Apr 17 15:23:29.275275 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.275237 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244656f4-836d-4a3e-8e59-7cf0742a314b-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78\" (UID: \"244656f4-836d-4a3e-8e59-7cf0742a314b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78"
Apr 17 15:23:29.275494 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.275286 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ncxj\" (UniqueName: \"kubernetes.io/projected/244656f4-836d-4a3e-8e59-7cf0742a314b-kube-api-access-5ncxj\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78\" (UID: \"244656f4-836d-4a3e-8e59-7cf0742a314b\") "
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" Apr 17 15:23:29.275494 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.275331 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244656f4-836d-4a3e-8e59-7cf0742a314b-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78\" (UID: \"244656f4-836d-4a3e-8e59-7cf0742a314b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" Apr 17 15:23:29.275713 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.275691 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244656f4-836d-4a3e-8e59-7cf0742a314b-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78\" (UID: \"244656f4-836d-4a3e-8e59-7cf0742a314b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" Apr 17 15:23:29.275713 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.275709 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244656f4-836d-4a3e-8e59-7cf0742a314b-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78\" (UID: \"244656f4-836d-4a3e-8e59-7cf0742a314b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" Apr 17 15:23:29.283286 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.283239 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ncxj\" (UniqueName: \"kubernetes.io/projected/244656f4-836d-4a3e-8e59-7cf0742a314b-kube-api-access-5ncxj\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78\" (UID: \"244656f4-836d-4a3e-8e59-7cf0742a314b\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" Apr 17 
15:23:29.403045 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.403008 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" Apr 17 15:23:29.522062 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:29.522038 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78"] Apr 17 15:23:29.524240 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:23:29.524205 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244656f4_836d_4a3e_8e59_7cf0742a314b.slice/crio-6ceca4799688149d43ffc71d50fa0bd84dbcb727dd098ed1878a706d4a483b06 WatchSource:0}: Error finding container 6ceca4799688149d43ffc71d50fa0bd84dbcb727dd098ed1878a706d4a483b06: Status 404 returned error can't find the container with id 6ceca4799688149d43ffc71d50fa0bd84dbcb727dd098ed1878a706d4a483b06 Apr 17 15:23:30.316695 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.316662 2567 generic.go:358] "Generic (PLEG): container finished" podID="244656f4-836d-4a3e-8e59-7cf0742a314b" containerID="8ee619b6bf7a1c27fddeebebedc424e97b75a6c99b4f9a6fa611ed6caa63dec1" exitCode=0 Apr 17 15:23:30.317008 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.316746 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" event={"ID":"244656f4-836d-4a3e-8e59-7cf0742a314b","Type":"ContainerDied","Data":"8ee619b6bf7a1c27fddeebebedc424e97b75a6c99b4f9a6fa611ed6caa63dec1"} Apr 17 15:23:30.317008 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.316780 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" 
event={"ID":"244656f4-836d-4a3e-8e59-7cf0742a314b","Type":"ContainerStarted","Data":"6ceca4799688149d43ffc71d50fa0bd84dbcb727dd098ed1878a706d4a483b06"} Apr 17 15:23:30.900533 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.900503 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx"] Apr 17 15:23:30.903959 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.903929 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:30.906676 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.906653 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 15:23:30.907055 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.907031 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 15:23:30.907249 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.907227 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-kvnp8\"" Apr 17 15:23:30.907382 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.907285 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 15:23:30.907604 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.907583 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 15:23:30.916560 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.916538 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx"] Apr 17 15:23:30.988986 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.988949 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhqzv\" (UniqueName: \"kubernetes.io/projected/27b103cb-80f5-4d9f-9bc7-a1812dddb90b-kube-api-access-vhqzv\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-49nqx\" (UID: \"27b103cb-80f5-4d9f-9bc7-a1812dddb90b\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:30.989159 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.989030 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/27b103cb-80f5-4d9f-9bc7-a1812dddb90b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-49nqx\" (UID: \"27b103cb-80f5-4d9f-9bc7-a1812dddb90b\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:30.989159 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:30.989051 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27b103cb-80f5-4d9f-9bc7-a1812dddb90b-webhook-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-49nqx\" (UID: \"27b103cb-80f5-4d9f-9bc7-a1812dddb90b\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:31.090023 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:31.089984 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/27b103cb-80f5-4d9f-9bc7-a1812dddb90b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-49nqx\" (UID: \"27b103cb-80f5-4d9f-9bc7-a1812dddb90b\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:31.090204 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:31.090030 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27b103cb-80f5-4d9f-9bc7-a1812dddb90b-webhook-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-49nqx\" (UID: \"27b103cb-80f5-4d9f-9bc7-a1812dddb90b\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:31.090204 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:31.090098 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhqzv\" (UniqueName: \"kubernetes.io/projected/27b103cb-80f5-4d9f-9bc7-a1812dddb90b-kube-api-access-vhqzv\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-49nqx\" (UID: \"27b103cb-80f5-4d9f-9bc7-a1812dddb90b\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:31.092479 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:31.092453 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/27b103cb-80f5-4d9f-9bc7-a1812dddb90b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-49nqx\" (UID: \"27b103cb-80f5-4d9f-9bc7-a1812dddb90b\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:31.092854 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:31.092832 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27b103cb-80f5-4d9f-9bc7-a1812dddb90b-webhook-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-49nqx\" (UID: \"27b103cb-80f5-4d9f-9bc7-a1812dddb90b\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:31.099918 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:31.099891 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhqzv\" (UniqueName: \"kubernetes.io/projected/27b103cb-80f5-4d9f-9bc7-a1812dddb90b-kube-api-access-vhqzv\") pod 
\"opendatahub-operator-controller-manager-9bd7bdf77-49nqx\" (UID: \"27b103cb-80f5-4d9f-9bc7-a1812dddb90b\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:31.215403 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:31.215367 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:31.326021 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:31.325985 2567 generic.go:358] "Generic (PLEG): container finished" podID="244656f4-836d-4a3e-8e59-7cf0742a314b" containerID="2a6822e0d97788d1a0c15e893531a4a968b62ebac52046b63f705ad4a93460d4" exitCode=0 Apr 17 15:23:31.326390 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:31.326046 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" event={"ID":"244656f4-836d-4a3e-8e59-7cf0742a314b","Type":"ContainerDied","Data":"2a6822e0d97788d1a0c15e893531a4a968b62ebac52046b63f705ad4a93460d4"} Apr 17 15:23:31.349286 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:31.349260 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx"] Apr 17 15:23:31.349926 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:23:31.349904 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27b103cb_80f5_4d9f_9bc7_a1812dddb90b.slice/crio-04a1a89e869d1466694c0c67dac6d38802bcc43bb3557a9b1af54aa367de8a25 WatchSource:0}: Error finding container 04a1a89e869d1466694c0c67dac6d38802bcc43bb3557a9b1af54aa367de8a25: Status 404 returned error can't find the container with id 04a1a89e869d1466694c0c67dac6d38802bcc43bb3557a9b1af54aa367de8a25 Apr 17 15:23:32.332397 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:32.332342 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" event={"ID":"27b103cb-80f5-4d9f-9bc7-a1812dddb90b","Type":"ContainerStarted","Data":"04a1a89e869d1466694c0c67dac6d38802bcc43bb3557a9b1af54aa367de8a25"} Apr 17 15:23:32.334650 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:32.334622 2567 generic.go:358] "Generic (PLEG): container finished" podID="244656f4-836d-4a3e-8e59-7cf0742a314b" containerID="251ccd2f369af2161ca9731c799c7e746b9cf6072f1272ea0b765f01c58122f9" exitCode=0 Apr 17 15:23:32.334787 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:32.334696 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" event={"ID":"244656f4-836d-4a3e-8e59-7cf0742a314b","Type":"ContainerDied","Data":"251ccd2f369af2161ca9731c799c7e746b9cf6072f1272ea0b765f01c58122f9"} Apr 17 15:23:34.374398 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:34.374376 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" Apr 17 15:23:34.420637 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:34.420608 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244656f4-836d-4a3e-8e59-7cf0742a314b-util\") pod \"244656f4-836d-4a3e-8e59-7cf0742a314b\" (UID: \"244656f4-836d-4a3e-8e59-7cf0742a314b\") " Apr 17 15:23:34.420869 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:34.420681 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ncxj\" (UniqueName: \"kubernetes.io/projected/244656f4-836d-4a3e-8e59-7cf0742a314b-kube-api-access-5ncxj\") pod \"244656f4-836d-4a3e-8e59-7cf0742a314b\" (UID: \"244656f4-836d-4a3e-8e59-7cf0742a314b\") " Apr 17 15:23:34.420869 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:34.420711 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244656f4-836d-4a3e-8e59-7cf0742a314b-bundle\") pod \"244656f4-836d-4a3e-8e59-7cf0742a314b\" (UID: \"244656f4-836d-4a3e-8e59-7cf0742a314b\") " Apr 17 15:23:34.421924 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:34.421879 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244656f4-836d-4a3e-8e59-7cf0742a314b-bundle" (OuterVolumeSpecName: "bundle") pod "244656f4-836d-4a3e-8e59-7cf0742a314b" (UID: "244656f4-836d-4a3e-8e59-7cf0742a314b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:23:34.423423 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:34.423390 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244656f4-836d-4a3e-8e59-7cf0742a314b-kube-api-access-5ncxj" (OuterVolumeSpecName: "kube-api-access-5ncxj") pod "244656f4-836d-4a3e-8e59-7cf0742a314b" (UID: "244656f4-836d-4a3e-8e59-7cf0742a314b"). InnerVolumeSpecName "kube-api-access-5ncxj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:23:34.427873 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:34.427847 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244656f4-836d-4a3e-8e59-7cf0742a314b-util" (OuterVolumeSpecName: "util") pod "244656f4-836d-4a3e-8e59-7cf0742a314b" (UID: "244656f4-836d-4a3e-8e59-7cf0742a314b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:23:34.521849 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:34.521818 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244656f4-836d-4a3e-8e59-7cf0742a314b-util\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:23:34.521849 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:34.521845 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5ncxj\" (UniqueName: \"kubernetes.io/projected/244656f4-836d-4a3e-8e59-7cf0742a314b-kube-api-access-5ncxj\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:23:34.521849 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:34.521856 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244656f4-836d-4a3e-8e59-7cf0742a314b-bundle\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:23:35.345606 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:35.345571 2567 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" event={"ID":"27b103cb-80f5-4d9f-9bc7-a1812dddb90b","Type":"ContainerStarted","Data":"dca0aac967fd6466dad787b9543a75cad8062fe8d5ba9bc4d5406668c8743219"} Apr 17 15:23:35.345796 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:35.345675 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:35.347239 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:35.347218 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" event={"ID":"244656f4-836d-4a3e-8e59-7cf0742a314b","Type":"ContainerDied","Data":"6ceca4799688149d43ffc71d50fa0bd84dbcb727dd098ed1878a706d4a483b06"} Apr 17 15:23:35.347239 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:35.347241 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ceca4799688149d43ffc71d50fa0bd84dbcb727dd098ed1878a706d4a483b06" Apr 17 15:23:35.347408 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:35.347263 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bg78" Apr 17 15:23:35.366247 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:35.366204 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" podStartSLOduration=2.312693887 podStartE2EDuration="5.36619328s" podCreationTimestamp="2026-04-17 15:23:30 +0000 UTC" firstStartedPulling="2026-04-17 15:23:31.3515713 +0000 UTC m=+384.743113313" lastFinishedPulling="2026-04-17 15:23:34.405070678 +0000 UTC m=+387.796612706" observedRunningTime="2026-04-17 15:23:35.364169218 +0000 UTC m=+388.755711253" watchObservedRunningTime="2026-04-17 15:23:35.36619328 +0000 UTC m=+388.757735316" Apr 17 15:23:46.353606 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:46.353576 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-49nqx" Apr 17 15:23:52.356565 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.356531 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-bzprl"] Apr 17 15:23:52.356962 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.356858 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="244656f4-836d-4a3e-8e59-7cf0742a314b" containerName="extract" Apr 17 15:23:52.356962 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.356869 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="244656f4-836d-4a3e-8e59-7cf0742a314b" containerName="extract" Apr 17 15:23:52.356962 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.356881 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="244656f4-836d-4a3e-8e59-7cf0742a314b" containerName="util" Apr 17 15:23:52.356962 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.356886 2567 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="244656f4-836d-4a3e-8e59-7cf0742a314b" containerName="util" Apr 17 15:23:52.356962 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.356895 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="244656f4-836d-4a3e-8e59-7cf0742a314b" containerName="pull" Apr 17 15:23:52.356962 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.356901 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="244656f4-836d-4a3e-8e59-7cf0742a314b" containerName="pull" Apr 17 15:23:52.356962 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.356950 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="244656f4-836d-4a3e-8e59-7cf0742a314b" containerName="extract" Apr 17 15:23:52.363023 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.363004 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" Apr 17 15:23:52.365390 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.365362 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 17 15:23:52.365505 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.365460 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-lkd2b\"" Apr 17 15:23:52.369524 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.369503 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-bzprl"] Apr 17 15:23:52.467478 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.467441 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xq9r\" (UniqueName: \"kubernetes.io/projected/6cd03577-1ebb-4eff-8aff-97e5177167f0-kube-api-access-2xq9r\") pod \"odh-model-controller-858dbf95b8-bzprl\" (UID: \"6cd03577-1ebb-4eff-8aff-97e5177167f0\") " 
pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" Apr 17 15:23:52.467648 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.467507 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6cd03577-1ebb-4eff-8aff-97e5177167f0-cert\") pod \"odh-model-controller-858dbf95b8-bzprl\" (UID: \"6cd03577-1ebb-4eff-8aff-97e5177167f0\") " pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" Apr 17 15:23:52.568184 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.568147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6cd03577-1ebb-4eff-8aff-97e5177167f0-cert\") pod \"odh-model-controller-858dbf95b8-bzprl\" (UID: \"6cd03577-1ebb-4eff-8aff-97e5177167f0\") " pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" Apr 17 15:23:52.568394 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.568226 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xq9r\" (UniqueName: \"kubernetes.io/projected/6cd03577-1ebb-4eff-8aff-97e5177167f0-kube-api-access-2xq9r\") pod \"odh-model-controller-858dbf95b8-bzprl\" (UID: \"6cd03577-1ebb-4eff-8aff-97e5177167f0\") " pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" Apr 17 15:23:52.568394 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:23:52.568289 2567 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 15:23:52.568394 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:23:52.568372 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cd03577-1ebb-4eff-8aff-97e5177167f0-cert podName:6cd03577-1ebb-4eff-8aff-97e5177167f0 nodeName:}" failed. No retries permitted until 2026-04-17 15:23:53.068355553 +0000 UTC m=+406.459897566 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6cd03577-1ebb-4eff-8aff-97e5177167f0-cert") pod "odh-model-controller-858dbf95b8-bzprl" (UID: "6cd03577-1ebb-4eff-8aff-97e5177167f0") : secret "odh-model-controller-webhook-cert" not found Apr 17 15:23:52.576681 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:52.576658 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xq9r\" (UniqueName: \"kubernetes.io/projected/6cd03577-1ebb-4eff-8aff-97e5177167f0-kube-api-access-2xq9r\") pod \"odh-model-controller-858dbf95b8-bzprl\" (UID: \"6cd03577-1ebb-4eff-8aff-97e5177167f0\") " pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" Apr 17 15:23:53.072038 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:53.072001 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6cd03577-1ebb-4eff-8aff-97e5177167f0-cert\") pod \"odh-model-controller-858dbf95b8-bzprl\" (UID: \"6cd03577-1ebb-4eff-8aff-97e5177167f0\") " pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" Apr 17 15:23:53.074469 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:53.074440 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6cd03577-1ebb-4eff-8aff-97e5177167f0-cert\") pod \"odh-model-controller-858dbf95b8-bzprl\" (UID: \"6cd03577-1ebb-4eff-8aff-97e5177167f0\") " pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" Apr 17 15:23:53.274225 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:53.274193 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" Apr 17 15:23:53.388080 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:53.388046 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-bzprl"] Apr 17 15:23:53.391882 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:23:53.391853 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cd03577_1ebb_4eff_8aff_97e5177167f0.slice/crio-8bd2c041f9d3c63bcc0825b8571c4b7caca19d338b105fbc8a81ee94ea61cbac WatchSource:0}: Error finding container 8bd2c041f9d3c63bcc0825b8571c4b7caca19d338b105fbc8a81ee94ea61cbac: Status 404 returned error can't find the container with id 8bd2c041f9d3c63bcc0825b8571c4b7caca19d338b105fbc8a81ee94ea61cbac Apr 17 15:23:53.410280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:53.410244 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" event={"ID":"6cd03577-1ebb-4eff-8aff-97e5177167f0","Type":"ContainerStarted","Data":"8bd2c041f9d3c63bcc0825b8571c4b7caca19d338b105fbc8a81ee94ea61cbac"} Apr 17 15:23:56.863595 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:56.863570 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-c66kx"] Apr 17 15:23:56.867081 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:56.867062 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" Apr 17 15:23:56.869527 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:56.869501 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 17 15:23:56.869631 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:56.869610 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-4qbpj\"" Apr 17 15:23:56.875240 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:56.875217 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-c66kx"] Apr 17 15:23:57.009018 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:57.008980 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7525h\" (UniqueName: \"kubernetes.io/projected/8cb081aa-6157-47cb-8014-89e70208a3d0-kube-api-access-7525h\") pod \"kserve-controller-manager-856948b99f-c66kx\" (UID: \"8cb081aa-6157-47cb-8014-89e70208a3d0\") " pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" Apr 17 15:23:57.009180 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:57.009042 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cb081aa-6157-47cb-8014-89e70208a3d0-cert\") pod \"kserve-controller-manager-856948b99f-c66kx\" (UID: \"8cb081aa-6157-47cb-8014-89e70208a3d0\") " pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" Apr 17 15:23:57.109948 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:57.109851 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7525h\" (UniqueName: \"kubernetes.io/projected/8cb081aa-6157-47cb-8014-89e70208a3d0-kube-api-access-7525h\") pod \"kserve-controller-manager-856948b99f-c66kx\" (UID: 
\"8cb081aa-6157-47cb-8014-89e70208a3d0\") " pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" Apr 17 15:23:57.109948 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:57.109914 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cb081aa-6157-47cb-8014-89e70208a3d0-cert\") pod \"kserve-controller-manager-856948b99f-c66kx\" (UID: \"8cb081aa-6157-47cb-8014-89e70208a3d0\") " pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" Apr 17 15:23:57.112538 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:57.112514 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cb081aa-6157-47cb-8014-89e70208a3d0-cert\") pod \"kserve-controller-manager-856948b99f-c66kx\" (UID: \"8cb081aa-6157-47cb-8014-89e70208a3d0\") " pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" Apr 17 15:23:57.121136 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:57.121103 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7525h\" (UniqueName: \"kubernetes.io/projected/8cb081aa-6157-47cb-8014-89e70208a3d0-kube-api-access-7525h\") pod \"kserve-controller-manager-856948b99f-c66kx\" (UID: \"8cb081aa-6157-47cb-8014-89e70208a3d0\") " pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" Apr 17 15:23:57.201751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:57.201716 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" Apr 17 15:23:57.319715 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:57.319687 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-c66kx"] Apr 17 15:23:57.321729 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:23:57.321703 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cb081aa_6157_47cb_8014_89e70208a3d0.slice/crio-1ede117940010584684074bab1ea3ef091f409ca5f9452508889162e684b9163 WatchSource:0}: Error finding container 1ede117940010584684074bab1ea3ef091f409ca5f9452508889162e684b9163: Status 404 returned error can't find the container with id 1ede117940010584684074bab1ea3ef091f409ca5f9452508889162e684b9163 Apr 17 15:23:57.425582 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:57.425545 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" event={"ID":"8cb081aa-6157-47cb-8014-89e70208a3d0","Type":"ContainerStarted","Data":"1ede117940010584684074bab1ea3ef091f409ca5f9452508889162e684b9163"} Apr 17 15:23:57.426851 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:57.426820 2567 generic.go:358] "Generic (PLEG): container finished" podID="6cd03577-1ebb-4eff-8aff-97e5177167f0" containerID="86ee290b6f9dee45f89f7e3153895a1047b80f9e2b8015ad3967e8b2753e646a" exitCode=1 Apr 17 15:23:57.426984 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:57.426917 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" event={"ID":"6cd03577-1ebb-4eff-8aff-97e5177167f0","Type":"ContainerDied","Data":"86ee290b6f9dee45f89f7e3153895a1047b80f9e2b8015ad3967e8b2753e646a"} Apr 17 15:23:57.427127 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:57.427103 2567 scope.go:117] "RemoveContainer" 
containerID="86ee290b6f9dee45f89f7e3153895a1047b80f9e2b8015ad3967e8b2753e646a" Apr 17 15:23:58.432657 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:58.432619 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" event={"ID":"6cd03577-1ebb-4eff-8aff-97e5177167f0","Type":"ContainerStarted","Data":"343fbae8d296d5e3d0024740d7c9308f7abf956692035ca2e75bba7ec832c539"} Apr 17 15:23:58.433084 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:58.432704 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" Apr 17 15:23:58.448493 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:58.448455 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" podStartSLOduration=2.18269346 podStartE2EDuration="6.448443909s" podCreationTimestamp="2026-04-17 15:23:52 +0000 UTC" firstStartedPulling="2026-04-17 15:23:53.393045866 +0000 UTC m=+406.784587879" lastFinishedPulling="2026-04-17 15:23:57.658796316 +0000 UTC m=+411.050338328" observedRunningTime="2026-04-17 15:23:58.446994803 +0000 UTC m=+411.838536852" watchObservedRunningTime="2026-04-17 15:23:58.448443909 +0000 UTC m=+411.839985943" Apr 17 15:23:59.220030 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.219992 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m"] Apr 17 15:23:59.223888 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.223863 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" Apr 17 15:23:59.227454 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.227437 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x4vvd\"" Apr 17 15:23:59.228538 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.228514 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 15:23:59.228638 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.228594 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 15:23:59.240548 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.240525 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m"] Apr 17 15:23:59.331975 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.331942 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvmtf\" (UniqueName: \"kubernetes.io/projected/18052d78-c4e9-4554-b6c9-ed13890d4a6f-kube-api-access-mvmtf\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m\" (UID: \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" Apr 17 15:23:59.332153 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.332000 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18052d78-c4e9-4554-b6c9-ed13890d4a6f-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m\" (UID: \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" 
Apr 17 15:23:59.332153 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.332033 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18052d78-c4e9-4554-b6c9-ed13890d4a6f-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m\" (UID: \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" Apr 17 15:23:59.390798 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.390763 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj"] Apr 17 15:23:59.394078 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.394058 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" Apr 17 15:23:59.396379 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.396356 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 15:23:59.396498 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.396425 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-c95dl\"" Apr 17 15:23:59.396498 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.396368 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 15:23:59.404744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.404722 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj"] Apr 17 15:23:59.432504 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.432476 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvmtf\" (UniqueName: 
\"kubernetes.io/projected/18052d78-c4e9-4554-b6c9-ed13890d4a6f-kube-api-access-mvmtf\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m\" (UID: \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" Apr 17 15:23:59.432637 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.432516 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18052d78-c4e9-4554-b6c9-ed13890d4a6f-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m\" (UID: \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" Apr 17 15:23:59.432637 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.432542 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18052d78-c4e9-4554-b6c9-ed13890d4a6f-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m\" (UID: \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" Apr 17 15:23:59.433015 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.432904 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18052d78-c4e9-4554-b6c9-ed13890d4a6f-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m\" (UID: \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" Apr 17 15:23:59.433015 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.432928 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18052d78-c4e9-4554-b6c9-ed13890d4a6f-bundle\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m\" (UID: \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" Apr 17 15:23:59.440722 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.440701 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvmtf\" (UniqueName: \"kubernetes.io/projected/18052d78-c4e9-4554-b6c9-ed13890d4a6f-kube-api-access-mvmtf\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m\" (UID: \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" Apr 17 15:23:59.533528 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.533452 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/709f8d01-1d92-4090-a398-2530cfd1ed0e-tmp\") pod \"kube-auth-proxy-546dd5d8dc-bcsjj\" (UID: \"709f8d01-1d92-4090-a398-2530cfd1ed0e\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" Apr 17 15:23:59.533528 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.533515 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqw2t\" (UniqueName: \"kubernetes.io/projected/709f8d01-1d92-4090-a398-2530cfd1ed0e-kube-api-access-jqw2t\") pod \"kube-auth-proxy-546dd5d8dc-bcsjj\" (UID: \"709f8d01-1d92-4090-a398-2530cfd1ed0e\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" Apr 17 15:23:59.533721 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.533700 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/709f8d01-1d92-4090-a398-2530cfd1ed0e-tls-certs\") pod \"kube-auth-proxy-546dd5d8dc-bcsjj\" (UID: \"709f8d01-1d92-4090-a398-2530cfd1ed0e\") " 
pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" Apr 17 15:23:59.534273 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.534258 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" Apr 17 15:23:59.635289 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.635243 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/709f8d01-1d92-4090-a398-2530cfd1ed0e-tmp\") pod \"kube-auth-proxy-546dd5d8dc-bcsjj\" (UID: \"709f8d01-1d92-4090-a398-2530cfd1ed0e\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" Apr 17 15:23:59.635459 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.635340 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqw2t\" (UniqueName: \"kubernetes.io/projected/709f8d01-1d92-4090-a398-2530cfd1ed0e-kube-api-access-jqw2t\") pod \"kube-auth-proxy-546dd5d8dc-bcsjj\" (UID: \"709f8d01-1d92-4090-a398-2530cfd1ed0e\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" Apr 17 15:23:59.635459 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.635426 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/709f8d01-1d92-4090-a398-2530cfd1ed0e-tls-certs\") pod \"kube-auth-proxy-546dd5d8dc-bcsjj\" (UID: \"709f8d01-1d92-4090-a398-2530cfd1ed0e\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" Apr 17 15:23:59.638020 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.637995 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/709f8d01-1d92-4090-a398-2530cfd1ed0e-tmp\") pod \"kube-auth-proxy-546dd5d8dc-bcsjj\" (UID: \"709f8d01-1d92-4090-a398-2530cfd1ed0e\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" Apr 17 15:23:59.638584 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:23:59.638560 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/709f8d01-1d92-4090-a398-2530cfd1ed0e-tls-certs\") pod \"kube-auth-proxy-546dd5d8dc-bcsjj\" (UID: \"709f8d01-1d92-4090-a398-2530cfd1ed0e\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" Apr 17 15:23:59.642988 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.642963 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqw2t\" (UniqueName: \"kubernetes.io/projected/709f8d01-1d92-4090-a398-2530cfd1ed0e-kube-api-access-jqw2t\") pod \"kube-auth-proxy-546dd5d8dc-bcsjj\" (UID: \"709f8d01-1d92-4090-a398-2530cfd1ed0e\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" Apr 17 15:23:59.659220 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.659196 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m"] Apr 17 15:23:59.660547 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:23:59.660527 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18052d78_c4e9_4554_b6c9_ed13890d4a6f.slice/crio-30e9dd51638cae698591d13523d953057ff49ffbf8bd6ea5a46106ecb57ee5dc WatchSource:0}: Error finding container 30e9dd51638cae698591d13523d953057ff49ffbf8bd6ea5a46106ecb57ee5dc: Status 404 returned error can't find the container with id 30e9dd51638cae698591d13523d953057ff49ffbf8bd6ea5a46106ecb57ee5dc Apr 17 15:23:59.705946 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.705919 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" Apr 17 15:23:59.832965 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:23:59.832935 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj"] Apr 17 15:23:59.834900 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:23:59.834871 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod709f8d01_1d92_4090_a398_2530cfd1ed0e.slice/crio-4359c55a68d13ec127a33cd5cc4d9b5f0328a2755e8ca21d09dd78d4d0b449b7 WatchSource:0}: Error finding container 4359c55a68d13ec127a33cd5cc4d9b5f0328a2755e8ca21d09dd78d4d0b449b7: Status 404 returned error can't find the container with id 4359c55a68d13ec127a33cd5cc4d9b5f0328a2755e8ca21d09dd78d4d0b449b7 Apr 17 15:24:00.448616 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:00.448565 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" event={"ID":"709f8d01-1d92-4090-a398-2530cfd1ed0e","Type":"ContainerStarted","Data":"4359c55a68d13ec127a33cd5cc4d9b5f0328a2755e8ca21d09dd78d4d0b449b7"} Apr 17 15:24:00.450766 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:00.450727 2567 generic.go:358] "Generic (PLEG): container finished" podID="18052d78-c4e9-4554-b6c9-ed13890d4a6f" containerID="fa28bf1dfef689633c332e36accbef0191c8ef25b8788107e83bfde222c27e64" exitCode=0 Apr 17 15:24:00.450923 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:00.450816 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" event={"ID":"18052d78-c4e9-4554-b6c9-ed13890d4a6f","Type":"ContainerDied","Data":"fa28bf1dfef689633c332e36accbef0191c8ef25b8788107e83bfde222c27e64"} Apr 17 15:24:00.450923 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:00.450885 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" event={"ID":"18052d78-c4e9-4554-b6c9-ed13890d4a6f","Type":"ContainerStarted","Data":"30e9dd51638cae698591d13523d953057ff49ffbf8bd6ea5a46106ecb57ee5dc"} Apr 17 15:24:01.456758 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:01.456716 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" event={"ID":"8cb081aa-6157-47cb-8014-89e70208a3d0","Type":"ContainerStarted","Data":"547d066c86563984bcfb7ece0ac23255ad81a566a5af2924a9e6a8e52f3ce03b"} Apr 17 15:24:01.457210 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:01.456789 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" Apr 17 15:24:01.476463 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:01.476401 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" podStartSLOduration=2.278709775 podStartE2EDuration="5.476383302s" podCreationTimestamp="2026-04-17 15:23:56 +0000 UTC" firstStartedPulling="2026-04-17 15:23:57.331209453 +0000 UTC m=+410.722751466" lastFinishedPulling="2026-04-17 15:24:00.52888297 +0000 UTC m=+413.920424993" observedRunningTime="2026-04-17 15:24:01.47452298 +0000 UTC m=+414.866065032" watchObservedRunningTime="2026-04-17 15:24:01.476383302 +0000 UTC m=+414.867925337" Apr 17 15:24:02.461612 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:02.461575 2567 generic.go:358] "Generic (PLEG): container finished" podID="18052d78-c4e9-4554-b6c9-ed13890d4a6f" containerID="e805892a361dabd0b89ad5c4aa39b485b30a61ecd7ff9bdefb9799e44cab0390" exitCode=0 Apr 17 15:24:02.462087 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:02.461659 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" 
event={"ID":"18052d78-c4e9-4554-b6c9-ed13890d4a6f","Type":"ContainerDied","Data":"e805892a361dabd0b89ad5c4aa39b485b30a61ecd7ff9bdefb9799e44cab0390"} Apr 17 15:24:03.467872 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:03.467837 2567 generic.go:358] "Generic (PLEG): container finished" podID="18052d78-c4e9-4554-b6c9-ed13890d4a6f" containerID="2ea49738b3c6e90289e08afb828f94ac4e5f80b675d52afe5741c75f3f0842bd" exitCode=0 Apr 17 15:24:03.468302 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:03.467934 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" event={"ID":"18052d78-c4e9-4554-b6c9-ed13890d4a6f","Type":"ContainerDied","Data":"2ea49738b3c6e90289e08afb828f94ac4e5f80b675d52afe5741c75f3f0842bd"} Apr 17 15:24:04.472443 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:04.472399 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" event={"ID":"709f8d01-1d92-4090-a398-2530cfd1ed0e","Type":"ContainerStarted","Data":"2af4c7265ad451e7ea90d0e857b034171c756371b59da03e9b47487f348de00d"} Apr 17 15:24:04.490142 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:04.490094 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-bcsjj" podStartSLOduration=1.5529652980000002 podStartE2EDuration="5.490078902s" podCreationTimestamp="2026-04-17 15:23:59 +0000 UTC" firstStartedPulling="2026-04-17 15:23:59.836818979 +0000 UTC m=+413.228360997" lastFinishedPulling="2026-04-17 15:24:03.773932576 +0000 UTC m=+417.165474601" observedRunningTime="2026-04-17 15:24:04.488969814 +0000 UTC m=+417.880511850" watchObservedRunningTime="2026-04-17 15:24:04.490078902 +0000 UTC m=+417.881620936" Apr 17 15:24:04.607734 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:04.607702 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" Apr 17 15:24:04.785827 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:04.785747 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvmtf\" (UniqueName: \"kubernetes.io/projected/18052d78-c4e9-4554-b6c9-ed13890d4a6f-kube-api-access-mvmtf\") pod \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\" (UID: \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\") " Apr 17 15:24:04.785827 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:04.785825 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18052d78-c4e9-4554-b6c9-ed13890d4a6f-bundle\") pod \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\" (UID: \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\") " Apr 17 15:24:04.786062 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:04.785866 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18052d78-c4e9-4554-b6c9-ed13890d4a6f-util\") pod \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\" (UID: \"18052d78-c4e9-4554-b6c9-ed13890d4a6f\") " Apr 17 15:24:04.786687 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:04.786652 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18052d78-c4e9-4554-b6c9-ed13890d4a6f-bundle" (OuterVolumeSpecName: "bundle") pod "18052d78-c4e9-4554-b6c9-ed13890d4a6f" (UID: "18052d78-c4e9-4554-b6c9-ed13890d4a6f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:24:04.787854 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:04.787828 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18052d78-c4e9-4554-b6c9-ed13890d4a6f-kube-api-access-mvmtf" (OuterVolumeSpecName: "kube-api-access-mvmtf") pod "18052d78-c4e9-4554-b6c9-ed13890d4a6f" (UID: "18052d78-c4e9-4554-b6c9-ed13890d4a6f"). InnerVolumeSpecName "kube-api-access-mvmtf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:24:04.887484 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:04.887447 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mvmtf\" (UniqueName: \"kubernetes.io/projected/18052d78-c4e9-4554-b6c9-ed13890d4a6f-kube-api-access-mvmtf\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:24:04.887484 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:04.887478 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18052d78-c4e9-4554-b6c9-ed13890d4a6f-bundle\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:24:04.954704 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:04.954666 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18052d78-c4e9-4554-b6c9-ed13890d4a6f-util" (OuterVolumeSpecName: "util") pod "18052d78-c4e9-4554-b6c9-ed13890d4a6f" (UID: "18052d78-c4e9-4554-b6c9-ed13890d4a6f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:24:04.988558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:04.988527 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18052d78-c4e9-4554-b6c9-ed13890d4a6f-util\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:24:05.482491 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:05.482459 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" Apr 17 15:24:05.482491 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:05.482459 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k9x4m" event={"ID":"18052d78-c4e9-4554-b6c9-ed13890d4a6f","Type":"ContainerDied","Data":"30e9dd51638cae698591d13523d953057ff49ffbf8bd6ea5a46106ecb57ee5dc"} Apr 17 15:24:05.482491 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:05.482499 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30e9dd51638cae698591d13523d953057ff49ffbf8bd6ea5a46106ecb57ee5dc" Apr 17 15:24:09.438460 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:09.438431 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-bzprl" Apr 17 15:24:12.877568 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.877525 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq"] Apr 17 15:24:12.878047 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.877993 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18052d78-c4e9-4554-b6c9-ed13890d4a6f" containerName="util" Apr 17 15:24:12.878047 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.878011 2567 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="18052d78-c4e9-4554-b6c9-ed13890d4a6f" containerName="util" Apr 17 15:24:12.878047 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.878023 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18052d78-c4e9-4554-b6c9-ed13890d4a6f" containerName="pull" Apr 17 15:24:12.878047 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.878030 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="18052d78-c4e9-4554-b6c9-ed13890d4a6f" containerName="pull" Apr 17 15:24:12.878047 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.878046 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18052d78-c4e9-4554-b6c9-ed13890d4a6f" containerName="extract" Apr 17 15:24:12.878293 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.878055 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="18052d78-c4e9-4554-b6c9-ed13890d4a6f" containerName="extract" Apr 17 15:24:12.878293 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.878186 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="18052d78-c4e9-4554-b6c9-ed13890d4a6f" containerName="extract" Apr 17 15:24:12.885672 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.885651 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" Apr 17 15:24:12.889072 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.889046 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x4vvd\"" Apr 17 15:24:12.889210 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.889182 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 15:24:12.889448 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.889427 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq"] Apr 17 15:24:12.890201 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.890187 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 15:24:12.946516 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.946484 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdtxf\" (UniqueName: \"kubernetes.io/projected/6c36ae3e-733b-40a1-a010-69288b3789bd-kube-api-access-cdtxf\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq\" (UID: \"6c36ae3e-733b-40a1-a010-69288b3789bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" Apr 17 15:24:12.946711 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.946592 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c36ae3e-733b-40a1-a010-69288b3789bd-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq\" (UID: \"6c36ae3e-733b-40a1-a010-69288b3789bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" 
Apr 17 15:24:12.946711 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:12.946681 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c36ae3e-733b-40a1-a010-69288b3789bd-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq\" (UID: \"6c36ae3e-733b-40a1-a010-69288b3789bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" Apr 17 15:24:13.047035 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:13.046998 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c36ae3e-733b-40a1-a010-69288b3789bd-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq\" (UID: \"6c36ae3e-733b-40a1-a010-69288b3789bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" Apr 17 15:24:13.047190 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:13.047063 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c36ae3e-733b-40a1-a010-69288b3789bd-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq\" (UID: \"6c36ae3e-733b-40a1-a010-69288b3789bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" Apr 17 15:24:13.047190 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:13.047086 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdtxf\" (UniqueName: \"kubernetes.io/projected/6c36ae3e-733b-40a1-a010-69288b3789bd-kube-api-access-cdtxf\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq\" (UID: \"6c36ae3e-733b-40a1-a010-69288b3789bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" Apr 17 15:24:13.047415 ip-10-0-130-92 kubenswrapper[2567]: 
I0417 15:24:13.047397 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c36ae3e-733b-40a1-a010-69288b3789bd-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq\" (UID: \"6c36ae3e-733b-40a1-a010-69288b3789bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" Apr 17 15:24:13.047456 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:13.047425 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c36ae3e-733b-40a1-a010-69288b3789bd-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq\" (UID: \"6c36ae3e-733b-40a1-a010-69288b3789bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" Apr 17 15:24:13.070574 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:13.070550 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdtxf\" (UniqueName: \"kubernetes.io/projected/6c36ae3e-733b-40a1-a010-69288b3789bd-kube-api-access-cdtxf\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq\" (UID: \"6c36ae3e-733b-40a1-a010-69288b3789bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" Apr 17 15:24:13.196380 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:13.196287 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" Apr 17 15:24:13.332957 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:13.332932 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq"] Apr 17 15:24:13.335415 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:24:13.335386 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c36ae3e_733b_40a1_a010_69288b3789bd.slice/crio-6aa8207696170d32d73034e3ee2b6b8b2df1b05e8ed5841ae39dc83127f008db WatchSource:0}: Error finding container 6aa8207696170d32d73034e3ee2b6b8b2df1b05e8ed5841ae39dc83127f008db: Status 404 returned error can't find the container with id 6aa8207696170d32d73034e3ee2b6b8b2df1b05e8ed5841ae39dc83127f008db Apr 17 15:24:13.512244 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:13.512161 2567 generic.go:358] "Generic (PLEG): container finished" podID="6c36ae3e-733b-40a1-a010-69288b3789bd" containerID="d20099c6ec0166a92a283a33c167ccde057d0978afbe51428a73e9f430d9509d" exitCode=0 Apr 17 15:24:13.512244 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:13.512211 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" event={"ID":"6c36ae3e-733b-40a1-a010-69288b3789bd","Type":"ContainerDied","Data":"d20099c6ec0166a92a283a33c167ccde057d0978afbe51428a73e9f430d9509d"} Apr 17 15:24:13.512447 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:13.512238 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" event={"ID":"6c36ae3e-733b-40a1-a010-69288b3789bd","Type":"ContainerStarted","Data":"6aa8207696170d32d73034e3ee2b6b8b2df1b05e8ed5841ae39dc83127f008db"} Apr 17 15:24:14.239791 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:24:14.239707 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q"] Apr 17 15:24:14.243121 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.243105 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" Apr 17 15:24:14.245717 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.245693 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-hsx66\"" Apr 17 15:24:14.245717 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.245713 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 15:24:14.245906 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.245808 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 15:24:14.255818 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.255795 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a80ee34f-0fac-45df-b089-b061f2ce2320-operator-config\") pod \"servicemesh-operator3-55f49c5f94-wbl7q\" (UID: \"a80ee34f-0fac-45df-b089-b061f2ce2320\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" Apr 17 15:24:14.255929 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.255831 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp8wt\" (UniqueName: \"kubernetes.io/projected/a80ee34f-0fac-45df-b089-b061f2ce2320-kube-api-access-tp8wt\") pod \"servicemesh-operator3-55f49c5f94-wbl7q\" (UID: \"a80ee34f-0fac-45df-b089-b061f2ce2320\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" Apr 17 15:24:14.256717 ip-10-0-130-92 kubenswrapper[2567]: 
I0417 15:24:14.256695 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q"] Apr 17 15:24:14.356219 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.356180 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a80ee34f-0fac-45df-b089-b061f2ce2320-operator-config\") pod \"servicemesh-operator3-55f49c5f94-wbl7q\" (UID: \"a80ee34f-0fac-45df-b089-b061f2ce2320\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" Apr 17 15:24:14.356384 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.356233 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tp8wt\" (UniqueName: \"kubernetes.io/projected/a80ee34f-0fac-45df-b089-b061f2ce2320-kube-api-access-tp8wt\") pod \"servicemesh-operator3-55f49c5f94-wbl7q\" (UID: \"a80ee34f-0fac-45df-b089-b061f2ce2320\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" Apr 17 15:24:14.358675 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.358655 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a80ee34f-0fac-45df-b089-b061f2ce2320-operator-config\") pod \"servicemesh-operator3-55f49c5f94-wbl7q\" (UID: \"a80ee34f-0fac-45df-b089-b061f2ce2320\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" Apr 17 15:24:14.364776 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.364750 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp8wt\" (UniqueName: \"kubernetes.io/projected/a80ee34f-0fac-45df-b089-b061f2ce2320-kube-api-access-tp8wt\") pod \"servicemesh-operator3-55f49c5f94-wbl7q\" (UID: \"a80ee34f-0fac-45df-b089-b061f2ce2320\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" Apr 17 15:24:14.517486 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.517390 2567 
generic.go:358] "Generic (PLEG): container finished" podID="6c36ae3e-733b-40a1-a010-69288b3789bd" containerID="c250d675270a462fd9dfa89116ed1bc5c8a64c0c60bb494490916ceee8a51fc3" exitCode=0 Apr 17 15:24:14.517486 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.517458 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" event={"ID":"6c36ae3e-733b-40a1-a010-69288b3789bd","Type":"ContainerDied","Data":"c250d675270a462fd9dfa89116ed1bc5c8a64c0c60bb494490916ceee8a51fc3"} Apr 17 15:24:14.552647 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.552583 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" Apr 17 15:24:14.689642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:14.689616 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q"] Apr 17 15:24:14.692652 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:24:14.692626 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda80ee34f_0fac_45df_b089_b061f2ce2320.slice/crio-cf0b22c43e3d96acb73a45195c1890c85abab4e0e8eeb2d1cbd8663d534f72e8 WatchSource:0}: Error finding container cf0b22c43e3d96acb73a45195c1890c85abab4e0e8eeb2d1cbd8663d534f72e8: Status 404 returned error can't find the container with id cf0b22c43e3d96acb73a45195c1890c85abab4e0e8eeb2d1cbd8663d534f72e8 Apr 17 15:24:15.524440 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:15.524401 2567 generic.go:358] "Generic (PLEG): container finished" podID="6c36ae3e-733b-40a1-a010-69288b3789bd" containerID="0fe9da28a6a0c0fa02aeccf726ba7e1a610c74dd7ae73b496827250dfff0458f" exitCode=0 Apr 17 15:24:15.524879 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:15.524493 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" event={"ID":"6c36ae3e-733b-40a1-a010-69288b3789bd","Type":"ContainerDied","Data":"0fe9da28a6a0c0fa02aeccf726ba7e1a610c74dd7ae73b496827250dfff0458f"} Apr 17 15:24:15.526333 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:15.526293 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" event={"ID":"a80ee34f-0fac-45df-b089-b061f2ce2320","Type":"ContainerStarted","Data":"cf0b22c43e3d96acb73a45195c1890c85abab4e0e8eeb2d1cbd8663d534f72e8"} Apr 17 15:24:16.974034 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:16.974014 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" Apr 17 15:24:16.982058 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:16.982036 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c36ae3e-733b-40a1-a010-69288b3789bd-util\") pod \"6c36ae3e-733b-40a1-a010-69288b3789bd\" (UID: \"6c36ae3e-733b-40a1-a010-69288b3789bd\") " Apr 17 15:24:16.982181 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:16.982079 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c36ae3e-733b-40a1-a010-69288b3789bd-bundle\") pod \"6c36ae3e-733b-40a1-a010-69288b3789bd\" (UID: \"6c36ae3e-733b-40a1-a010-69288b3789bd\") " Apr 17 15:24:16.982181 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:16.982128 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdtxf\" (UniqueName: \"kubernetes.io/projected/6c36ae3e-733b-40a1-a010-69288b3789bd-kube-api-access-cdtxf\") pod \"6c36ae3e-733b-40a1-a010-69288b3789bd\" (UID: \"6c36ae3e-733b-40a1-a010-69288b3789bd\") " Apr 17 15:24:16.982932 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:24:16.982907 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c36ae3e-733b-40a1-a010-69288b3789bd-bundle" (OuterVolumeSpecName: "bundle") pod "6c36ae3e-733b-40a1-a010-69288b3789bd" (UID: "6c36ae3e-733b-40a1-a010-69288b3789bd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:24:16.984237 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:16.984206 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c36ae3e-733b-40a1-a010-69288b3789bd-kube-api-access-cdtxf" (OuterVolumeSpecName: "kube-api-access-cdtxf") pod "6c36ae3e-733b-40a1-a010-69288b3789bd" (UID: "6c36ae3e-733b-40a1-a010-69288b3789bd"). InnerVolumeSpecName "kube-api-access-cdtxf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:24:16.989592 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:16.989553 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c36ae3e-733b-40a1-a010-69288b3789bd-util" (OuterVolumeSpecName: "util") pod "6c36ae3e-733b-40a1-a010-69288b3789bd" (UID: "6c36ae3e-733b-40a1-a010-69288b3789bd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:24:17.083807 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:17.083763 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c36ae3e-733b-40a1-a010-69288b3789bd-bundle\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:24:17.083807 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:17.083802 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cdtxf\" (UniqueName: \"kubernetes.io/projected/6c36ae3e-733b-40a1-a010-69288b3789bd-kube-api-access-cdtxf\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:24:17.084051 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:17.083817 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c36ae3e-733b-40a1-a010-69288b3789bd-util\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:24:17.541785 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:17.541741 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" event={"ID":"6c36ae3e-733b-40a1-a010-69288b3789bd","Type":"ContainerDied","Data":"6aa8207696170d32d73034e3ee2b6b8b2df1b05e8ed5841ae39dc83127f008db"} Apr 17 15:24:17.541785 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:17.541784 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aa8207696170d32d73034e3ee2b6b8b2df1b05e8ed5841ae39dc83127f008db" Apr 17 15:24:17.542335 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:17.541791 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hc4zq" Apr 17 15:24:17.543269 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:17.543244 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" event={"ID":"a80ee34f-0fac-45df-b089-b061f2ce2320","Type":"ContainerStarted","Data":"aeac59f1053845981ce0ea7d64f142f0887cb356580640bc7514d9eb2e3562fc"} Apr 17 15:24:17.543426 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:17.543414 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" Apr 17 15:24:17.563908 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:17.563863 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" podStartSLOduration=1.238808403 podStartE2EDuration="3.563852403s" podCreationTimestamp="2026-04-17 15:24:14 +0000 UTC" firstStartedPulling="2026-04-17 15:24:14.695229679 +0000 UTC m=+428.086771695" lastFinishedPulling="2026-04-17 15:24:17.020273682 +0000 UTC m=+430.411815695" observedRunningTime="2026-04-17 15:24:17.560707907 +0000 UTC m=+430.952249942" watchObservedRunningTime="2026-04-17 15:24:17.563852403 +0000 UTC m=+430.955394437" Apr 17 15:24:25.800236 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.800163 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7"] Apr 17 15:24:25.800618 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.800570 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c36ae3e-733b-40a1-a010-69288b3789bd" containerName="util" Apr 17 15:24:25.800618 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.800583 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c36ae3e-733b-40a1-a010-69288b3789bd" containerName="util" Apr 17 
15:24:25.800618 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.800594 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c36ae3e-733b-40a1-a010-69288b3789bd" containerName="pull" Apr 17 15:24:25.800618 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.800600 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c36ae3e-733b-40a1-a010-69288b3789bd" containerName="pull" Apr 17 15:24:25.800618 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.800609 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c36ae3e-733b-40a1-a010-69288b3789bd" containerName="extract" Apr 17 15:24:25.800618 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.800615 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c36ae3e-733b-40a1-a010-69288b3789bd" containerName="extract" Apr 17 15:24:25.800800 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.800674 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c36ae3e-733b-40a1-a010-69288b3789bd" containerName="extract" Apr 17 15:24:25.803977 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.803956 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.806568 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.806542 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 15:24:25.806687 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.806579 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 15:24:25.806687 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.806542 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 15:24:25.806687 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.806542 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 15:24:25.806826 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.806584 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-gn8lr\"" Apr 17 15:24:25.814791 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.814771 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7"] Apr 17 15:24:25.856347 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.856302 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.856471 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.856356 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.856471 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.856380 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.856471 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.856439 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.856632 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.856485 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.856632 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.856526 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: 
\"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.856632 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.856572 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f26tk\" (UniqueName: \"kubernetes.io/projected/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-kube-api-access-f26tk\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.957730 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.957698 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.957851 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.957743 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.957851 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.957767 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.957851 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.957794 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.957851 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.957823 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.957851 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.957842 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.958096 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.958063 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f26tk\" (UniqueName: \"kubernetes.io/projected/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-kube-api-access-f26tk\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.958517 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.958490 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.960336 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.960290 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.960564 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.960546 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.960643 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.960576 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.960643 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.960599 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.965371 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.965343 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:25.965592 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:25.965571 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f26tk\" (UniqueName: \"kubernetes.io/projected/31b7a54e-f948-41b1-8ab5-1edb1c1f74e0-kube-api-access-f26tk\") pod \"istiod-openshift-gateway-55ff986f96-zcfm7\" (UID: \"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:26.114270 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:26.114181 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:26.242637 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:26.242571 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7"] Apr 17 15:24:26.245487 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:24:26.245456 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31b7a54e_f948_41b1_8ab5_1edb1c1f74e0.slice/crio-87ac60d56013fe2e5970bdd1aa22be93f16ce2850caa9fc0445a23973564fc87 WatchSource:0}: Error finding container 87ac60d56013fe2e5970bdd1aa22be93f16ce2850caa9fc0445a23973564fc87: Status 404 returned error can't find the container with id 87ac60d56013fe2e5970bdd1aa22be93f16ce2850caa9fc0445a23973564fc87 Apr 17 15:24:26.580624 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:26.580583 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" 
event={"ID":"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0","Type":"ContainerStarted","Data":"87ac60d56013fe2e5970bdd1aa22be93f16ce2850caa9fc0445a23973564fc87"} Apr 17 15:24:28.540293 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:28.540258 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 15:24:28.540723 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:28.540352 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 15:24:28.549374 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:28.549353 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wbl7q" Apr 17 15:24:29.594110 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:29.594078 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" event={"ID":"31b7a54e-f948-41b1-8ab5-1edb1c1f74e0","Type":"ContainerStarted","Data":"5e58e63e1a0f396ef9468fedc9f44a846b2e402eb219b1ed6ca2e1b8996a013d"} Apr 17 15:24:29.594640 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:29.594221 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:29.595878 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:29.595842 2567 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-zcfm7 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 17 15:24:29.595984 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:29.595885 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" 
podUID="31b7a54e-f948-41b1-8ab5-1edb1c1f74e0" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 15:24:29.625296 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:29.625246 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" podStartSLOduration=2.332637555 podStartE2EDuration="4.625232557s" podCreationTimestamp="2026-04-17 15:24:25 +0000 UTC" firstStartedPulling="2026-04-17 15:24:26.247417389 +0000 UTC m=+439.638959402" lastFinishedPulling="2026-04-17 15:24:28.540012387 +0000 UTC m=+441.931554404" observedRunningTime="2026-04-17 15:24:29.622666563 +0000 UTC m=+443.014208615" watchObservedRunningTime="2026-04-17 15:24:29.625232557 +0000 UTC m=+443.016774592" Apr 17 15:24:30.599142 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:30.599113 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zcfm7" Apr 17 15:24:32.467564 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:24:32.467527 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-c66kx" Apr 17 15:25:01.295880 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.295844 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk"] Apr 17 15:25:01.298249 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.298232 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" Apr 17 15:25:01.300705 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.300680 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 15:25:01.300820 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.300678 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 15:25:01.300820 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.300811 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-7w66g\"" Apr 17 15:25:01.305491 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.305468 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk"] Apr 17 15:25:01.463048 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.463000 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7sg8\" (UniqueName: \"kubernetes.io/projected/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-kube-api-access-c7sg8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk\" (UID: \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" Apr 17 15:25:01.463048 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.463050 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk\" (UID: \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" Apr 17 15:25:01.463340 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:25:01.463192 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk\" (UID: \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" Apr 17 15:25:01.564084 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.564000 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7sg8\" (UniqueName: \"kubernetes.io/projected/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-kube-api-access-c7sg8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk\" (UID: \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" Apr 17 15:25:01.564084 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.564042 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk\" (UID: \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" Apr 17 15:25:01.564293 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.564099 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk\" (UID: \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" Apr 17 15:25:01.564472 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.564451 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk\" (UID: \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" Apr 17 15:25:01.564529 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.564482 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk\" (UID: \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" Apr 17 15:25:01.572751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.572723 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7sg8\" (UniqueName: \"kubernetes.io/projected/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-kube-api-access-c7sg8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk\" (UID: \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" Apr 17 15:25:01.608517 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.608488 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" Apr 17 15:25:01.735021 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.734987 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk"] Apr 17 15:25:01.737990 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:25:01.737962 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda99383d9_65b6_4ee3_ab0d_6c0ec759718c.slice/crio-9d35504f7483437f90482ac3e4f6f3e21ba21af50e61cf4a198185976bf79c8c WatchSource:0}: Error finding container 9d35504f7483437f90482ac3e4f6f3e21ba21af50e61cf4a198185976bf79c8c: Status 404 returned error can't find the container with id 9d35504f7483437f90482ac3e4f6f3e21ba21af50e61cf4a198185976bf79c8c Apr 17 15:25:01.888192 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.888169 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64"] Apr 17 15:25:01.892938 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.892922 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" Apr 17 15:25:01.898223 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:01.898204 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64"] Apr 17 15:25:02.068730 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.068692 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64\" (UID: \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" Apr 17 15:25:02.068730 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.068729 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fng9f\" (UniqueName: \"kubernetes.io/projected/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-kube-api-access-fng9f\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64\" (UID: \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" Apr 17 15:25:02.068939 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.068849 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64\" (UID: \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" Apr 17 15:25:02.169672 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.169638 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64\" (UID: \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" Apr 17 15:25:02.169672 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.169676 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fng9f\" (UniqueName: \"kubernetes.io/projected/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-kube-api-access-fng9f\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64\" (UID: \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" Apr 17 15:25:02.169863 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.169742 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64\" (UID: \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" Apr 17 15:25:02.170092 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.170069 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64\" (UID: \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" Apr 17 15:25:02.170138 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.170073 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-bundle\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64\" (UID: \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" Apr 17 15:25:02.177877 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.177849 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fng9f\" (UniqueName: \"kubernetes.io/projected/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-kube-api-access-fng9f\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64\" (UID: \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" Apr 17 15:25:02.219518 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.219495 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" Apr 17 15:25:02.338239 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.338214 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64"] Apr 17 15:25:02.339115 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:25:02.339085 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcccdfbc3_a737_4bd7_8b48_d6c109a8e987.slice/crio-bc1509dfb64714be3c5349ab2d20c64f089300cad9f33eab2df0912f2e413125 WatchSource:0}: Error finding container bc1509dfb64714be3c5349ab2d20c64f089300cad9f33eab2df0912f2e413125: Status 404 returned error can't find the container with id bc1509dfb64714be3c5349ab2d20c64f089300cad9f33eab2df0912f2e413125 Apr 17 15:25:02.493393 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.493356 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98"] Apr 17 15:25:02.495878 ip-10-0-130-92 kubenswrapper[2567]: 
I0417 15:25:02.495856 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" Apr 17 15:25:02.503936 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.503913 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98"] Apr 17 15:25:02.580230 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.580188 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g5s8\" (UniqueName: \"kubernetes.io/projected/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-kube-api-access-8g5s8\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98\" (UID: \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" Apr 17 15:25:02.580426 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.580264 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98\" (UID: \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" Apr 17 15:25:02.580426 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.580332 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98\" (UID: \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" Apr 17 15:25:02.681032 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.680958 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98\" (UID: \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" Apr 17 15:25:02.681032 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.680999 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98\" (UID: \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" Apr 17 15:25:02.681197 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.681048 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8g5s8\" (UniqueName: \"kubernetes.io/projected/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-kube-api-access-8g5s8\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98\" (UID: \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" Apr 17 15:25:02.681422 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.681399 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98\" (UID: \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" Apr 17 15:25:02.681470 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.681407 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98\" (UID: \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" Apr 17 15:25:02.689083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.689055 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g5s8\" (UniqueName: \"kubernetes.io/projected/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-kube-api-access-8g5s8\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98\" (UID: \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" Apr 17 15:25:02.723379 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.723348 2567 generic.go:358] "Generic (PLEG): container finished" podID="cccdfbc3-a737-4bd7-8b48-d6c109a8e987" containerID="b7075dd1816349f025dcdaf410bc482b54056ae21280e2b158c0c77d6e3b7824" exitCode=0 Apr 17 15:25:02.723502 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.723441 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" event={"ID":"cccdfbc3-a737-4bd7-8b48-d6c109a8e987","Type":"ContainerDied","Data":"b7075dd1816349f025dcdaf410bc482b54056ae21280e2b158c0c77d6e3b7824"} Apr 17 15:25:02.723502 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.723476 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" event={"ID":"cccdfbc3-a737-4bd7-8b48-d6c109a8e987","Type":"ContainerStarted","Data":"bc1509dfb64714be3c5349ab2d20c64f089300cad9f33eab2df0912f2e413125"} Apr 17 15:25:02.725001 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.724978 2567 generic.go:358] "Generic (PLEG): container finished" podID="a99383d9-65b6-4ee3-ab0d-6c0ec759718c" 
containerID="07518b56aaf9dd52afb11a635b50ca58a16b99eeda90cab2e3910a5fb8810db6" exitCode=0 Apr 17 15:25:02.725091 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.725043 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" event={"ID":"a99383d9-65b6-4ee3-ab0d-6c0ec759718c","Type":"ContainerDied","Data":"07518b56aaf9dd52afb11a635b50ca58a16b99eeda90cab2e3910a5fb8810db6"} Apr 17 15:25:02.725091 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.725073 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" event={"ID":"a99383d9-65b6-4ee3-ab0d-6c0ec759718c","Type":"ContainerStarted","Data":"9d35504f7483437f90482ac3e4f6f3e21ba21af50e61cf4a198185976bf79c8c"} Apr 17 15:25:02.806291 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.806262 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" Apr 17 15:25:02.927149 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:02.927120 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98"] Apr 17 15:25:02.928249 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:25:02.928224 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cf6f2c5_ccf3_4836_b98e_e30127cf91f8.slice/crio-fb610f8ef2a82d8ff1ddb53622d297aea2e42f249c883eb933fc38f0e3e15532 WatchSource:0}: Error finding container fb610f8ef2a82d8ff1ddb53622d297aea2e42f249c883eb933fc38f0e3e15532: Status 404 returned error can't find the container with id fb610f8ef2a82d8ff1ddb53622d297aea2e42f249c883eb933fc38f0e3e15532 Apr 17 15:25:03.093899 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.093870 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4"] Apr 17 15:25:03.096365 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.096349 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" Apr 17 15:25:03.104230 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.104207 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4"] Apr 17 15:25:03.185044 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.184955 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37a35ca9-255c-451b-9bad-8be8bf2a870e-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4\" (UID: \"37a35ca9-255c-451b-9bad-8be8bf2a870e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" Apr 17 15:25:03.185044 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.185008 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37a35ca9-255c-451b-9bad-8be8bf2a870e-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4\" (UID: \"37a35ca9-255c-451b-9bad-8be8bf2a870e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" Apr 17 15:25:03.185272 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.185147 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9r6q\" (UniqueName: \"kubernetes.io/projected/37a35ca9-255c-451b-9bad-8be8bf2a870e-kube-api-access-b9r6q\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4\" (UID: \"37a35ca9-255c-451b-9bad-8be8bf2a870e\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" Apr 17 15:25:03.286226 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.286185 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37a35ca9-255c-451b-9bad-8be8bf2a870e-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4\" (UID: \"37a35ca9-255c-451b-9bad-8be8bf2a870e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" Apr 17 15:25:03.286435 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.286237 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37a35ca9-255c-451b-9bad-8be8bf2a870e-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4\" (UID: \"37a35ca9-255c-451b-9bad-8be8bf2a870e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" Apr 17 15:25:03.286435 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.286335 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9r6q\" (UniqueName: \"kubernetes.io/projected/37a35ca9-255c-451b-9bad-8be8bf2a870e-kube-api-access-b9r6q\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4\" (UID: \"37a35ca9-255c-451b-9bad-8be8bf2a870e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" Apr 17 15:25:03.286636 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.286617 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37a35ca9-255c-451b-9bad-8be8bf2a870e-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4\" (UID: \"37a35ca9-255c-451b-9bad-8be8bf2a870e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" Apr 17 15:25:03.286679 
ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.286659 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37a35ca9-255c-451b-9bad-8be8bf2a870e-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4\" (UID: \"37a35ca9-255c-451b-9bad-8be8bf2a870e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" Apr 17 15:25:03.294958 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.294932 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9r6q\" (UniqueName: \"kubernetes.io/projected/37a35ca9-255c-451b-9bad-8be8bf2a870e-kube-api-access-b9r6q\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4\" (UID: \"37a35ca9-255c-451b-9bad-8be8bf2a870e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" Apr 17 15:25:03.406680 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.406656 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" Apr 17 15:25:03.533747 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.533715 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4"] Apr 17 15:25:03.576614 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:25:03.576582 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a35ca9_255c_451b_9bad_8be8bf2a870e.slice/crio-161ec22013f8ef0814905acda2c71bbe5ea632906f7bbd9defca723228b41d10 WatchSource:0}: Error finding container 161ec22013f8ef0814905acda2c71bbe5ea632906f7bbd9defca723228b41d10: Status 404 returned error can't find the container with id 161ec22013f8ef0814905acda2c71bbe5ea632906f7bbd9defca723228b41d10 Apr 17 15:25:03.730965 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.730935 2567 generic.go:358] "Generic (PLEG): container finished" podID="3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" containerID="db27d544a7464e54c39ac4d11285ca5a4ad2a3ba4eacf96ab2b8a516b9f9f3f2" exitCode=0 Apr 17 15:25:03.731069 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.731013 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" event={"ID":"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8","Type":"ContainerDied","Data":"db27d544a7464e54c39ac4d11285ca5a4ad2a3ba4eacf96ab2b8a516b9f9f3f2"} Apr 17 15:25:03.731069 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.731051 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" event={"ID":"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8","Type":"ContainerStarted","Data":"fb610f8ef2a82d8ff1ddb53622d297aea2e42f249c883eb933fc38f0e3e15532"} Apr 17 15:25:03.732778 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.732760 2567 
generic.go:358] "Generic (PLEG): container finished" podID="cccdfbc3-a737-4bd7-8b48-d6c109a8e987" containerID="f705e6158e3562c23457c157c3c20cc1c6d1a330ae5e5284bc87a370c7e80c52" exitCode=0 Apr 17 15:25:03.732874 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.732828 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" event={"ID":"cccdfbc3-a737-4bd7-8b48-d6c109a8e987","Type":"ContainerDied","Data":"f705e6158e3562c23457c157c3c20cc1c6d1a330ae5e5284bc87a370c7e80c52"} Apr 17 15:25:03.734689 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.734495 2567 generic.go:358] "Generic (PLEG): container finished" podID="a99383d9-65b6-4ee3-ab0d-6c0ec759718c" containerID="30623195d1b7215d80071e0a44e9a4030e69173fca7af2ecd7f33d09005481a0" exitCode=0 Apr 17 15:25:03.734689 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.734576 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" event={"ID":"a99383d9-65b6-4ee3-ab0d-6c0ec759718c","Type":"ContainerDied","Data":"30623195d1b7215d80071e0a44e9a4030e69173fca7af2ecd7f33d09005481a0"} Apr 17 15:25:03.736091 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.736000 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" event={"ID":"37a35ca9-255c-451b-9bad-8be8bf2a870e","Type":"ContainerStarted","Data":"74e12a973fccf5286ae4847ff816c6d0792d25273e8a424e331794d53c755c85"} Apr 17 15:25:03.736091 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:03.736024 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" event={"ID":"37a35ca9-255c-451b-9bad-8be8bf2a870e","Type":"ContainerStarted","Data":"161ec22013f8ef0814905acda2c71bbe5ea632906f7bbd9defca723228b41d10"} Apr 17 15:25:04.742457 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:25:04.742431 2567 generic.go:358] "Generic (PLEG): container finished" podID="a99383d9-65b6-4ee3-ab0d-6c0ec759718c" containerID="f8681952375c3cb38cb95c114a0afb88615985a9e010b970f2f3565a7848ae6f" exitCode=0
Apr 17 15:25:04.742845 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:04.742528 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" event={"ID":"a99383d9-65b6-4ee3-ab0d-6c0ec759718c","Type":"ContainerDied","Data":"f8681952375c3cb38cb95c114a0afb88615985a9e010b970f2f3565a7848ae6f"}
Apr 17 15:25:04.745352 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:04.744665 2567 generic.go:358] "Generic (PLEG): container finished" podID="37a35ca9-255c-451b-9bad-8be8bf2a870e" containerID="74e12a973fccf5286ae4847ff816c6d0792d25273e8a424e331794d53c755c85" exitCode=0
Apr 17 15:25:04.745352 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:04.744688 2567 generic.go:358] "Generic (PLEG): container finished" podID="37a35ca9-255c-451b-9bad-8be8bf2a870e" containerID="f4af3e3774f4c445a6bf8f751d1c74b0c74c12d68510d890016567d05596eb98" exitCode=0
Apr 17 15:25:04.745352 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:04.744835 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" event={"ID":"37a35ca9-255c-451b-9bad-8be8bf2a870e","Type":"ContainerDied","Data":"74e12a973fccf5286ae4847ff816c6d0792d25273e8a424e331794d53c755c85"}
Apr 17 15:25:04.745352 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:04.744857 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" event={"ID":"37a35ca9-255c-451b-9bad-8be8bf2a870e","Type":"ContainerDied","Data":"f4af3e3774f4c445a6bf8f751d1c74b0c74c12d68510d890016567d05596eb98"}
Apr 17 15:25:04.749570 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:04.749544 2567 generic.go:358] "Generic (PLEG): container finished" podID="cccdfbc3-a737-4bd7-8b48-d6c109a8e987" containerID="ee087e67145408ec8e7aee920be8845b2db6560f19cab51f0fd7997fc3250275" exitCode=0
Apr 17 15:25:04.749674 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:04.749578 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" event={"ID":"cccdfbc3-a737-4bd7-8b48-d6c109a8e987","Type":"ContainerDied","Data":"ee087e67145408ec8e7aee920be8845b2db6560f19cab51f0fd7997fc3250275"}
Apr 17 15:25:05.755468 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:05.755434 2567 generic.go:358] "Generic (PLEG): container finished" podID="3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" containerID="c25ba228197aa1dd2422bb3ef6f7007d89e5780c1461beb64c801697c96f8fcb" exitCode=0
Apr 17 15:25:05.755847 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:05.755526 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" event={"ID":"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8","Type":"ContainerDied","Data":"c25ba228197aa1dd2422bb3ef6f7007d89e5780c1461beb64c801697c96f8fcb"}
Apr 17 15:25:05.757532 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:05.757509 2567 generic.go:358] "Generic (PLEG): container finished" podID="37a35ca9-255c-451b-9bad-8be8bf2a870e" containerID="f2ac17485d6cd3cae5ce238d6c2fad580876184641189feb9493d07e8fa6a601" exitCode=0
Apr 17 15:25:05.757598 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:05.757582 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" event={"ID":"37a35ca9-255c-451b-9bad-8be8bf2a870e","Type":"ContainerDied","Data":"f2ac17485d6cd3cae5ce238d6c2fad580876184641189feb9493d07e8fa6a601"}
Apr 17 15:25:05.943220 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:05.943200 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk"
Apr 17 15:25:05.946426 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:05.946407 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64"
Apr 17 15:25:06.011335 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.011285 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-bundle\") pod \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\" (UID: \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\") "
Apr 17 15:25:06.011480 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.011349 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-bundle\") pod \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\" (UID: \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\") "
Apr 17 15:25:06.011480 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.011377 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fng9f\" (UniqueName: \"kubernetes.io/projected/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-kube-api-access-fng9f\") pod \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\" (UID: \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\") "
Apr 17 15:25:06.011480 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.011401 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-util\") pod \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\" (UID: \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\") "
Apr 17 15:25:06.011480 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.011443 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7sg8\" (UniqueName: \"kubernetes.io/projected/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-kube-api-access-c7sg8\") pod \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\" (UID: \"a99383d9-65b6-4ee3-ab0d-6c0ec759718c\") "
Apr 17 15:25:06.011480 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.011466 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-util\") pod \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\" (UID: \"cccdfbc3-a737-4bd7-8b48-d6c109a8e987\") "
Apr 17 15:25:06.011878 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.011851 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-bundle" (OuterVolumeSpecName: "bundle") pod "a99383d9-65b6-4ee3-ab0d-6c0ec759718c" (UID: "a99383d9-65b6-4ee3-ab0d-6c0ec759718c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:25:06.012153 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.012128 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-bundle" (OuterVolumeSpecName: "bundle") pod "cccdfbc3-a737-4bd7-8b48-d6c109a8e987" (UID: "cccdfbc3-a737-4bd7-8b48-d6c109a8e987"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:25:06.013783 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.013754 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-kube-api-access-fng9f" (OuterVolumeSpecName: "kube-api-access-fng9f") pod "cccdfbc3-a737-4bd7-8b48-d6c109a8e987" (UID: "cccdfbc3-a737-4bd7-8b48-d6c109a8e987"). InnerVolumeSpecName "kube-api-access-fng9f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 15:25:06.013874 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.013778 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-kube-api-access-c7sg8" (OuterVolumeSpecName: "kube-api-access-c7sg8") pod "a99383d9-65b6-4ee3-ab0d-6c0ec759718c" (UID: "a99383d9-65b6-4ee3-ab0d-6c0ec759718c"). InnerVolumeSpecName "kube-api-access-c7sg8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 15:25:06.019726 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.019699 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-util" (OuterVolumeSpecName: "util") pod "cccdfbc3-a737-4bd7-8b48-d6c109a8e987" (UID: "cccdfbc3-a737-4bd7-8b48-d6c109a8e987"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:25:06.020207 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.020190 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-util" (OuterVolumeSpecName: "util") pod "a99383d9-65b6-4ee3-ab0d-6c0ec759718c" (UID: "a99383d9-65b6-4ee3-ab0d-6c0ec759718c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:25:06.112934 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.112860 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-util\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:25:06.112934 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.112893 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c7sg8\" (UniqueName: \"kubernetes.io/projected/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-kube-api-access-c7sg8\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:25:06.112934 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.112909 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-util\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:25:06.112934 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.112924 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a99383d9-65b6-4ee3-ab0d-6c0ec759718c-bundle\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:25:06.113153 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.112938 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-bundle\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:25:06.113153 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.112952 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fng9f\" (UniqueName: \"kubernetes.io/projected/cccdfbc3-a737-4bd7-8b48-d6c109a8e987-kube-api-access-fng9f\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:25:06.763671 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.763638 2567 generic.go:358] "Generic (PLEG): container finished" podID="3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" containerID="be18c658cd30330f85c0feb7c9925244ecd2e5947d3cc3e97c2558e42583da65" exitCode=0
Apr 17 15:25:06.764220 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.763733 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" event={"ID":"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8","Type":"ContainerDied","Data":"be18c658cd30330f85c0feb7c9925244ecd2e5947d3cc3e97c2558e42583da65"}
Apr 17 15:25:06.765453 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.765425 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64" event={"ID":"cccdfbc3-a737-4bd7-8b48-d6c109a8e987","Type":"ContainerDied","Data":"bc1509dfb64714be3c5349ab2d20c64f089300cad9f33eab2df0912f2e413125"}
Apr 17 15:25:06.765589 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.765457 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc1509dfb64714be3c5349ab2d20c64f089300cad9f33eab2df0912f2e413125"
Apr 17 15:25:06.765589 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.765434 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64"
Apr 17 15:25:06.767202 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.767180 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk"
Apr 17 15:25:06.767202 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.767190 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk" event={"ID":"a99383d9-65b6-4ee3-ab0d-6c0ec759718c","Type":"ContainerDied","Data":"9d35504f7483437f90482ac3e4f6f3e21ba21af50e61cf4a198185976bf79c8c"}
Apr 17 15:25:06.767386 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.767217 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d35504f7483437f90482ac3e4f6f3e21ba21af50e61cf4a198185976bf79c8c"
Apr 17 15:25:06.895814 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.895793 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4"
Apr 17 15:25:06.919400 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.919379 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37a35ca9-255c-451b-9bad-8be8bf2a870e-util\") pod \"37a35ca9-255c-451b-9bad-8be8bf2a870e\" (UID: \"37a35ca9-255c-451b-9bad-8be8bf2a870e\") "
Apr 17 15:25:06.919536 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.919431 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9r6q\" (UniqueName: \"kubernetes.io/projected/37a35ca9-255c-451b-9bad-8be8bf2a870e-kube-api-access-b9r6q\") pod \"37a35ca9-255c-451b-9bad-8be8bf2a870e\" (UID: \"37a35ca9-255c-451b-9bad-8be8bf2a870e\") "
Apr 17 15:25:06.919536 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.919460 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37a35ca9-255c-451b-9bad-8be8bf2a870e-bundle\") pod \"37a35ca9-255c-451b-9bad-8be8bf2a870e\" (UID: \"37a35ca9-255c-451b-9bad-8be8bf2a870e\") "
Apr 17 15:25:06.920036 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.920002 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a35ca9-255c-451b-9bad-8be8bf2a870e-bundle" (OuterVolumeSpecName: "bundle") pod "37a35ca9-255c-451b-9bad-8be8bf2a870e" (UID: "37a35ca9-255c-451b-9bad-8be8bf2a870e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:25:06.921473 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.921449 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a35ca9-255c-451b-9bad-8be8bf2a870e-kube-api-access-b9r6q" (OuterVolumeSpecName: "kube-api-access-b9r6q") pod "37a35ca9-255c-451b-9bad-8be8bf2a870e" (UID: "37a35ca9-255c-451b-9bad-8be8bf2a870e"). InnerVolumeSpecName "kube-api-access-b9r6q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 15:25:06.926035 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:06.925990 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a35ca9-255c-451b-9bad-8be8bf2a870e-util" (OuterVolumeSpecName: "util") pod "37a35ca9-255c-451b-9bad-8be8bf2a870e" (UID: "37a35ca9-255c-451b-9bad-8be8bf2a870e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:25:07.020170 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.020132 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37a35ca9-255c-451b-9bad-8be8bf2a870e-util\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:25:07.020170 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.020165 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b9r6q\" (UniqueName: \"kubernetes.io/projected/37a35ca9-255c-451b-9bad-8be8bf2a870e-kube-api-access-b9r6q\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:25:07.020170 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.020175 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37a35ca9-255c-451b-9bad-8be8bf2a870e-bundle\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:25:07.773078 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.773038 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4" event={"ID":"37a35ca9-255c-451b-9bad-8be8bf2a870e","Type":"ContainerDied","Data":"161ec22013f8ef0814905acda2c71bbe5ea632906f7bbd9defca723228b41d10"}
Apr 17 15:25:07.773078 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.773084 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="161ec22013f8ef0814905acda2c71bbe5ea632906f7bbd9defca723228b41d10"
Apr 17 15:25:07.773526 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.773052 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4"
Apr 17 15:25:07.896945 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.896922 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98"
Apr 17 15:25:07.932085 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.932058 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-bundle\") pod \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\" (UID: \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\") "
Apr 17 15:25:07.932266 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.932110 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g5s8\" (UniqueName: \"kubernetes.io/projected/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-kube-api-access-8g5s8\") pod \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\" (UID: \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\") "
Apr 17 15:25:07.932266 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.932156 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-util\") pod \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\" (UID: \"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8\") "
Apr 17 15:25:07.932758 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.932716 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-bundle" (OuterVolumeSpecName: "bundle") pod "3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" (UID: "3cf6f2c5-ccf3-4836-b98e-e30127cf91f8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:25:07.934276 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.934250 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-kube-api-access-8g5s8" (OuterVolumeSpecName: "kube-api-access-8g5s8") pod "3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" (UID: "3cf6f2c5-ccf3-4836-b98e-e30127cf91f8"). InnerVolumeSpecName "kube-api-access-8g5s8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 15:25:07.937428 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:07.937407 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-util" (OuterVolumeSpecName: "util") pod "3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" (UID: "3cf6f2c5-ccf3-4836-b98e-e30127cf91f8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:25:08.033169 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:08.033078 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-util\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:25:08.033169 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:08.033112 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-bundle\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:25:08.033169 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:08.033121 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8g5s8\" (UniqueName: \"kubernetes.io/projected/3cf6f2c5-ccf3-4836-b98e-e30127cf91f8-kube-api-access-8g5s8\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:25:08.778004 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:08.777971 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98"
Apr 17 15:25:08.778004 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:08.777973 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98" event={"ID":"3cf6f2c5-ccf3-4836-b98e-e30127cf91f8","Type":"ContainerDied","Data":"fb610f8ef2a82d8ff1ddb53622d297aea2e42f249c883eb933fc38f0e3e15532"}
Apr 17 15:25:08.778004 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:08.778007 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb610f8ef2a82d8ff1ddb53622d297aea2e42f249c883eb933fc38f0e3e15532"
Apr 17 15:25:25.033298 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033263 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-ts8sf"]
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033660 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a99383d9-65b6-4ee3-ab0d-6c0ec759718c" containerName="util"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033673 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99383d9-65b6-4ee3-ab0d-6c0ec759718c" containerName="util"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033685 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a99383d9-65b6-4ee3-ab0d-6c0ec759718c" containerName="pull"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033691 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99383d9-65b6-4ee3-ab0d-6c0ec759718c" containerName="pull"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033699 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cccdfbc3-a737-4bd7-8b48-d6c109a8e987" containerName="util"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033704 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="cccdfbc3-a737-4bd7-8b48-d6c109a8e987" containerName="util"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033711 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a99383d9-65b6-4ee3-ab0d-6c0ec759718c" containerName="extract"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033717 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99383d9-65b6-4ee3-ab0d-6c0ec759718c" containerName="extract"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033725 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cccdfbc3-a737-4bd7-8b48-d6c109a8e987" containerName="pull"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033730 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="cccdfbc3-a737-4bd7-8b48-d6c109a8e987" containerName="pull"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033735 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" containerName="pull"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033740 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" containerName="pull"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033747 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" containerName="extract"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033752 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" containerName="extract"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033760 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" containerName="util"
Apr 17 15:25:25.033757 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033766 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" containerName="util"
Apr 17 15:25:25.034412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033774 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37a35ca9-255c-451b-9bad-8be8bf2a870e" containerName="extract"
Apr 17 15:25:25.034412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033779 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a35ca9-255c-451b-9bad-8be8bf2a870e" containerName="extract"
Apr 17 15:25:25.034412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033789 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37a35ca9-255c-451b-9bad-8be8bf2a870e" containerName="util"
Apr 17 15:25:25.034412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033793 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a35ca9-255c-451b-9bad-8be8bf2a870e" containerName="util"
Apr 17 15:25:25.034412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033799 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37a35ca9-255c-451b-9bad-8be8bf2a870e" containerName="pull"
Apr 17 15:25:25.034412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033804 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a35ca9-255c-451b-9bad-8be8bf2a870e" containerName="pull"
Apr 17 15:25:25.034412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033812 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cccdfbc3-a737-4bd7-8b48-d6c109a8e987" containerName="extract"
Apr 17 15:25:25.034412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033816 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="cccdfbc3-a737-4bd7-8b48-d6c109a8e987" containerName="extract"
Apr 17 15:25:25.034412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033878 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="37a35ca9-255c-451b-9bad-8be8bf2a870e" containerName="extract"
Apr 17 15:25:25.034412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033887 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a99383d9-65b6-4ee3-ab0d-6c0ec759718c" containerName="extract"
Apr 17 15:25:25.034412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033892 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="3cf6f2c5-ccf3-4836-b98e-e30127cf91f8" containerName="extract"
Apr 17 15:25:25.034412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.033899 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="cccdfbc3-a737-4bd7-8b48-d6c109a8e987" containerName="extract"
Apr 17 15:25:25.045929 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.045901 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-ts8sf"
Apr 17 15:25:25.048574 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.048549 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-4ppqc\""
Apr 17 15:25:25.048709 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.048549 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 15:25:25.048936 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.048919 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-ts8sf"]
Apr 17 15:25:25.049508 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.049491 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 15:25:25.174993 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.174965 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q42bk\" (UniqueName: \"kubernetes.io/projected/761f932c-6cb0-40a7-a949-ab44709b017d-kube-api-access-q42bk\") pod \"authorino-operator-657f44b778-ts8sf\" (UID: \"761f932c-6cb0-40a7-a949-ab44709b017d\") " pod="kuadrant-system/authorino-operator-657f44b778-ts8sf"
Apr 17 15:25:25.276291 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.276257 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q42bk\" (UniqueName: \"kubernetes.io/projected/761f932c-6cb0-40a7-a949-ab44709b017d-kube-api-access-q42bk\") pod \"authorino-operator-657f44b778-ts8sf\" (UID: \"761f932c-6cb0-40a7-a949-ab44709b017d\") " pod="kuadrant-system/authorino-operator-657f44b778-ts8sf"
Apr 17 15:25:25.295232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.295165 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q42bk\" (UniqueName: \"kubernetes.io/projected/761f932c-6cb0-40a7-a949-ab44709b017d-kube-api-access-q42bk\") pod \"authorino-operator-657f44b778-ts8sf\" (UID: \"761f932c-6cb0-40a7-a949-ab44709b017d\") " pod="kuadrant-system/authorino-operator-657f44b778-ts8sf"
Apr 17 15:25:25.357197 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.357167 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-ts8sf"
Apr 17 15:25:25.485729 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.485702 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-ts8sf"]
Apr 17 15:25:25.487678 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:25:25.487637 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod761f932c_6cb0_40a7_a949_ab44709b017d.slice/crio-d85363ca3380cb91a14c51cbe90b813e5fd8a729b103aaa36a14c13b61954c15 WatchSource:0}: Error finding container d85363ca3380cb91a14c51cbe90b813e5fd8a729b103aaa36a14c13b61954c15: Status 404 returned error can't find the container with id d85363ca3380cb91a14c51cbe90b813e5fd8a729b103aaa36a14c13b61954c15
Apr 17 15:25:25.845656 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:25.845623 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-ts8sf" event={"ID":"761f932c-6cb0-40a7-a949-ab44709b017d","Type":"ContainerStarted","Data":"d85363ca3380cb91a14c51cbe90b813e5fd8a729b103aaa36a14c13b61954c15"}
Apr 17 15:25:27.855220 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:27.855177 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-ts8sf" event={"ID":"761f932c-6cb0-40a7-a949-ab44709b017d","Type":"ContainerStarted","Data":"9179c658cf350e1eff1c510c9ca552de5fe0cf4f70757ec7d132959f059acaf4"}
Apr 17 15:25:27.855632 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:27.855335 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-ts8sf"
Apr 17 15:25:27.871973 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:27.871930 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-ts8sf" podStartSLOduration=1.127833523 podStartE2EDuration="2.871919118s" podCreationTimestamp="2026-04-17 15:25:25 +0000 UTC" firstStartedPulling="2026-04-17 15:25:25.489370167 +0000 UTC m=+498.880912180" lastFinishedPulling="2026-04-17 15:25:27.233455761 +0000 UTC m=+500.624997775" observedRunningTime="2026-04-17 15:25:27.87072099 +0000 UTC m=+501.262263037" watchObservedRunningTime="2026-04-17 15:25:27.871919118 +0000 UTC m=+501.263461153"
Apr 17 15:25:37.637932 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:37.637898 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f8d6cbd49-hvvm6"]
Apr 17 15:25:38.861491 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:38.861457 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-ts8sf"
Apr 17 15:25:48.471237 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.471202 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"]
Apr 17 15:25:48.476121 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.476100 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"
Apr 17 15:25:48.478571 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.478555 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-6gthh\""
Apr 17 15:25:48.486366 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.486344 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"]
Apr 17 15:25:48.576927 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.576896 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d10433d8-f189-4435-8e9b-882c1fbb9eca-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv\" (UID: \"d10433d8-f189-4435-8e9b-882c1fbb9eca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"
Apr 17 15:25:48.577091 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.576954 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qz5t\" (UniqueName: \"kubernetes.io/projected/d10433d8-f189-4435-8e9b-882c1fbb9eca-kube-api-access-8qz5t\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv\" (UID: \"d10433d8-f189-4435-8e9b-882c1fbb9eca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"
Apr 17 15:25:48.677387 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.677355 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d10433d8-f189-4435-8e9b-882c1fbb9eca-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv\" (UID: \"d10433d8-f189-4435-8e9b-882c1fbb9eca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"
Apr 17 15:25:48.677562 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.677417 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qz5t\" (UniqueName: \"kubernetes.io/projected/d10433d8-f189-4435-8e9b-882c1fbb9eca-kube-api-access-8qz5t\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv\" (UID: \"d10433d8-f189-4435-8e9b-882c1fbb9eca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"
Apr 17 15:25:48.677747 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.677725 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d10433d8-f189-4435-8e9b-882c1fbb9eca-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv\" (UID: \"d10433d8-f189-4435-8e9b-882c1fbb9eca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"
Apr 17 15:25:48.687158 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.687135 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qz5t\" (UniqueName: \"kubernetes.io/projected/d10433d8-f189-4435-8e9b-882c1fbb9eca-kube-api-access-8qz5t\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv\" (UID: \"d10433d8-f189-4435-8e9b-882c1fbb9eca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"
Apr 17 15:25:48.787849 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.787758 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"
Apr 17 15:25:48.915617 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.915592 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"]
Apr 17 15:25:48.918168 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:25:48.918137 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd10433d8_f189_4435_8e9b_882c1fbb9eca.slice/crio-622e60b43a79786792637bdce892d5282d3294792fa34d21ccc19a4852fda7b7 WatchSource:0}: Error finding container 622e60b43a79786792637bdce892d5282d3294792fa34d21ccc19a4852fda7b7: Status 404 returned error can't find the container with id 622e60b43a79786792637bdce892d5282d3294792fa34d21ccc19a4852fda7b7
Apr 17 15:25:48.938504 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:48.938477 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv" event={"ID":"d10433d8-f189-4435-8e9b-882c1fbb9eca","Type":"ContainerStarted","Data":"622e60b43a79786792637bdce892d5282d3294792fa34d21ccc19a4852fda7b7"}
Apr 17 15:25:49.221295 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.221251 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"]
Apr 17 15:25:49.228930 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.228896 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv"]
Apr 17 15:25:49.245781 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.245753 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm"]
Apr 17 15:25:49.250621 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.250600 2567 util.go:30] "No sandbox for pod can be
found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" Apr 17 15:25:49.263915 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.263881 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm"] Apr 17 15:25:49.388891 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.388852 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zb6m\" (UniqueName: \"kubernetes.io/projected/02bb9519-102c-4721-aa4e-91b788176012-kube-api-access-8zb6m\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kckqm\" (UID: \"02bb9519-102c-4721-aa4e-91b788176012\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" Apr 17 15:25:49.389068 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.389015 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/02bb9519-102c-4721-aa4e-91b788176012-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kckqm\" (UID: \"02bb9519-102c-4721-aa4e-91b788176012\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" Apr 17 15:25:49.490030 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.489959 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/02bb9519-102c-4721-aa4e-91b788176012-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kckqm\" (UID: \"02bb9519-102c-4721-aa4e-91b788176012\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" Apr 17 15:25:49.490030 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.490006 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zb6m\" 
(UniqueName: \"kubernetes.io/projected/02bb9519-102c-4721-aa4e-91b788176012-kube-api-access-8zb6m\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kckqm\" (UID: \"02bb9519-102c-4721-aa4e-91b788176012\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" Apr 17 15:25:49.490413 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.490339 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/02bb9519-102c-4721-aa4e-91b788176012-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kckqm\" (UID: \"02bb9519-102c-4721-aa4e-91b788176012\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" Apr 17 15:25:49.503252 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.503213 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zb6m\" (UniqueName: \"kubernetes.io/projected/02bb9519-102c-4721-aa4e-91b788176012-kube-api-access-8zb6m\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kckqm\" (UID: \"02bb9519-102c-4721-aa4e-91b788176012\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" Apr 17 15:25:49.564901 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.564867 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" Apr 17 15:25:49.799171 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.799140 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm"] Apr 17 15:25:49.911489 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:25:49.911454 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02bb9519_102c_4721_aa4e_91b788176012.slice/crio-05604f5794787efb82019bbc696d57e4fcfdfc3c5f736250e1b840c730f76514 WatchSource:0}: Error finding container 05604f5794787efb82019bbc696d57e4fcfdfc3c5f736250e1b840c730f76514: Status 404 returned error can't find the container with id 05604f5794787efb82019bbc696d57e4fcfdfc3c5f736250e1b840c730f76514 Apr 17 15:25:49.944537 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:49.944500 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" event={"ID":"02bb9519-102c-4721-aa4e-91b788176012","Type":"ContainerStarted","Data":"05604f5794787efb82019bbc696d57e4fcfdfc3c5f736250e1b840c730f76514"} Apr 17 15:25:53.964872 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:53.964814 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv" podUID="d10433d8-f189-4435-8e9b-882c1fbb9eca" containerName="manager" containerID="cri-o://3016a28f2bbddb83a251d44e16f885161ee4b6137f4e45ee6937fa4a8cd17c42" gracePeriod=2 Apr 17 15:25:53.966578 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:53.966543 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" 
event={"ID":"02bb9519-102c-4721-aa4e-91b788176012","Type":"ContainerStarted","Data":"680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c"} Apr 17 15:25:53.966717 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:53.966627 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" Apr 17 15:25:53.995461 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:53.995420 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" podStartSLOduration=1.604425757 podStartE2EDuration="4.995403089s" podCreationTimestamp="2026-04-17 15:25:49 +0000 UTC" firstStartedPulling="2026-04-17 15:25:49.914779268 +0000 UTC m=+523.306321284" lastFinishedPulling="2026-04-17 15:25:53.3057566 +0000 UTC m=+526.697298616" observedRunningTime="2026-04-17 15:25:53.993425086 +0000 UTC m=+527.384967120" watchObservedRunningTime="2026-04-17 15:25:53.995403089 +0000 UTC m=+527.386945129" Apr 17 15:25:54.202009 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.201985 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv" Apr 17 15:25:54.204201 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.204179 2567 status_manager.go:895] "Failed to get status for pod" podUID="d10433d8-f189-4435-8e9b-882c1fbb9eca" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv\" is forbidden: User \"system:node:ip-10-0-130-92.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-92.ec2.internal' and this object" Apr 17 15:25:54.338998 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.338923 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d10433d8-f189-4435-8e9b-882c1fbb9eca-extensions-socket-volume\") pod \"d10433d8-f189-4435-8e9b-882c1fbb9eca\" (UID: \"d10433d8-f189-4435-8e9b-882c1fbb9eca\") " Apr 17 15:25:54.338998 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.338977 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qz5t\" (UniqueName: \"kubernetes.io/projected/d10433d8-f189-4435-8e9b-882c1fbb9eca-kube-api-access-8qz5t\") pod \"d10433d8-f189-4435-8e9b-882c1fbb9eca\" (UID: \"d10433d8-f189-4435-8e9b-882c1fbb9eca\") " Apr 17 15:25:54.339241 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.339214 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d10433d8-f189-4435-8e9b-882c1fbb9eca-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "d10433d8-f189-4435-8e9b-882c1fbb9eca" (UID: "d10433d8-f189-4435-8e9b-882c1fbb9eca"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:25:54.340973 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.340944 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d10433d8-f189-4435-8e9b-882c1fbb9eca-kube-api-access-8qz5t" (OuterVolumeSpecName: "kube-api-access-8qz5t") pod "d10433d8-f189-4435-8e9b-882c1fbb9eca" (UID: "d10433d8-f189-4435-8e9b-882c1fbb9eca"). InnerVolumeSpecName "kube-api-access-8qz5t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:25:54.440179 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.440143 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d10433d8-f189-4435-8e9b-882c1fbb9eca-extensions-socket-volume\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:25:54.440179 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.440174 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8qz5t\" (UniqueName: \"kubernetes.io/projected/d10433d8-f189-4435-8e9b-882c1fbb9eca-kube-api-access-8qz5t\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:25:54.972215 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.972172 2567 generic.go:358] "Generic (PLEG): container finished" podID="d10433d8-f189-4435-8e9b-882c1fbb9eca" containerID="3016a28f2bbddb83a251d44e16f885161ee4b6137f4e45ee6937fa4a8cd17c42" exitCode=2 Apr 17 15:25:54.972660 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.972224 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv" Apr 17 15:25:54.972660 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.972270 2567 scope.go:117] "RemoveContainer" containerID="3016a28f2bbddb83a251d44e16f885161ee4b6137f4e45ee6937fa4a8cd17c42" Apr 17 15:25:54.974621 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.974595 2567 status_manager.go:895] "Failed to get status for pod" podUID="d10433d8-f189-4435-8e9b-882c1fbb9eca" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv\" is forbidden: User \"system:node:ip-10-0-130-92.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-92.ec2.internal' and this object" Apr 17 15:25:54.981193 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.981170 2567 scope.go:117] "RemoveContainer" containerID="3016a28f2bbddb83a251d44e16f885161ee4b6137f4e45ee6937fa4a8cd17c42" Apr 17 15:25:54.981461 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:25:54.981443 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3016a28f2bbddb83a251d44e16f885161ee4b6137f4e45ee6937fa4a8cd17c42\": container with ID starting with 3016a28f2bbddb83a251d44e16f885161ee4b6137f4e45ee6937fa4a8cd17c42 not found: ID does not exist" containerID="3016a28f2bbddb83a251d44e16f885161ee4b6137f4e45ee6937fa4a8cd17c42" Apr 17 15:25:54.981531 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.981470 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3016a28f2bbddb83a251d44e16f885161ee4b6137f4e45ee6937fa4a8cd17c42"} err="failed to get container status \"3016a28f2bbddb83a251d44e16f885161ee4b6137f4e45ee6937fa4a8cd17c42\": rpc error: code = NotFound desc = could not find container 
\"3016a28f2bbddb83a251d44e16f885161ee4b6137f4e45ee6937fa4a8cd17c42\": container with ID starting with 3016a28f2bbddb83a251d44e16f885161ee4b6137f4e45ee6937fa4a8cd17c42 not found: ID does not exist" Apr 17 15:25:54.982437 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:54.982413 2567 status_manager.go:895] "Failed to get status for pod" podUID="d10433d8-f189-4435-8e9b-882c1fbb9eca" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-w5zbv\" is forbidden: User \"system:node:ip-10-0-130-92.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-92.ec2.internal' and this object" Apr 17 15:25:55.113859 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:25:55.113826 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d10433d8-f189-4435-8e9b-882c1fbb9eca" path="/var/lib/kubelet/pods/d10433d8-f189-4435-8e9b-882c1fbb9eca/volumes" Apr 17 15:26:02.657417 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:02.657289 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f8d6cbd49-hvvm6" podUID="c9e80a9a-382c-4794-a077-7b9b4a747f03" containerName="console" containerID="cri-o://349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867" gracePeriod=15 Apr 17 15:26:02.907801 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:02.907741 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f8d6cbd49-hvvm6_c9e80a9a-382c-4794-a077-7b9b4a747f03/console/0.log" Apr 17 15:26:02.907901 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:02.907804 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:26:03.004073 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.004039 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7pk9\" (UniqueName: \"kubernetes.io/projected/c9e80a9a-382c-4794-a077-7b9b4a747f03-kube-api-access-h7pk9\") pod \"c9e80a9a-382c-4794-a077-7b9b4a747f03\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " Apr 17 15:26:03.004243 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.004106 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-config\") pod \"c9e80a9a-382c-4794-a077-7b9b4a747f03\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " Apr 17 15:26:03.004295 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.004266 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-trusted-ca-bundle\") pod \"c9e80a9a-382c-4794-a077-7b9b4a747f03\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " Apr 17 15:26:03.004366 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.004345 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-service-ca\") pod \"c9e80a9a-382c-4794-a077-7b9b4a747f03\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " Apr 17 15:26:03.004424 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.004390 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-oauth-serving-cert\") pod \"c9e80a9a-382c-4794-a077-7b9b4a747f03\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " Apr 17 15:26:03.004597 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:26:03.004498 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-config" (OuterVolumeSpecName: "console-config") pod "c9e80a9a-382c-4794-a077-7b9b4a747f03" (UID: "c9e80a9a-382c-4794-a077-7b9b4a747f03"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:26:03.004781 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.004760 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c9e80a9a-382c-4794-a077-7b9b4a747f03" (UID: "c9e80a9a-382c-4794-a077-7b9b4a747f03"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:26:03.004851 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.004794 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-service-ca" (OuterVolumeSpecName: "service-ca") pod "c9e80a9a-382c-4794-a077-7b9b4a747f03" (UID: "c9e80a9a-382c-4794-a077-7b9b4a747f03"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:26:03.004851 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.004829 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c9e80a9a-382c-4794-a077-7b9b4a747f03" (UID: "c9e80a9a-382c-4794-a077-7b9b4a747f03"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:26:03.006332 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.006290 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e80a9a-382c-4794-a077-7b9b4a747f03-kube-api-access-h7pk9" (OuterVolumeSpecName: "kube-api-access-h7pk9") pod "c9e80a9a-382c-4794-a077-7b9b4a747f03" (UID: "c9e80a9a-382c-4794-a077-7b9b4a747f03"). InnerVolumeSpecName "kube-api-access-h7pk9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:26:03.006535 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.006520 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f8d6cbd49-hvvm6_c9e80a9a-382c-4794-a077-7b9b4a747f03/console/0.log" Apr 17 15:26:03.006595 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.006555 2567 generic.go:358] "Generic (PLEG): container finished" podID="c9e80a9a-382c-4794-a077-7b9b4a747f03" containerID="349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867" exitCode=2 Apr 17 15:26:03.006637 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.006621 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f8d6cbd49-hvvm6" Apr 17 15:26:03.006667 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.006639 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f8d6cbd49-hvvm6" event={"ID":"c9e80a9a-382c-4794-a077-7b9b4a747f03","Type":"ContainerDied","Data":"349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867"} Apr 17 15:26:03.006700 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.006674 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f8d6cbd49-hvvm6" event={"ID":"c9e80a9a-382c-4794-a077-7b9b4a747f03","Type":"ContainerDied","Data":"5521f81d54240439076a0c0546bb9f27ae7d940300605ae38d050d08c153e266"} Apr 17 15:26:03.006700 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.006692 2567 scope.go:117] "RemoveContainer" containerID="349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867" Apr 17 15:26:03.017652 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.017635 2567 scope.go:117] "RemoveContainer" containerID="349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867" Apr 17 15:26:03.017922 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:26:03.017901 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867\": container with ID starting with 349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867 not found: ID does not exist" containerID="349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867" Apr 17 15:26:03.018020 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.017928 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867"} err="failed to get container status \"349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867\": rpc error: code = NotFound 
desc = could not find container \"349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867\": container with ID starting with 349d9a7eb5c57e3e2ed9b62fb32916a1f13d18f52f8069c6bb08717e702fd867 not found: ID does not exist" Apr 17 15:26:03.105352 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.105303 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-serving-cert\") pod \"c9e80a9a-382c-4794-a077-7b9b4a747f03\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " Apr 17 15:26:03.105352 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.105353 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-oauth-config\") pod \"c9e80a9a-382c-4794-a077-7b9b4a747f03\" (UID: \"c9e80a9a-382c-4794-a077-7b9b4a747f03\") " Apr 17 15:26:03.105612 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.105594 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-service-ca\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:26:03.105652 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.105621 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-oauth-serving-cert\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:26:03.105652 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.105639 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h7pk9\" (UniqueName: \"kubernetes.io/projected/c9e80a9a-382c-4794-a077-7b9b4a747f03-kube-api-access-h7pk9\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:26:03.105727 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:26:03.105653 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-config\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:26:03.105727 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.105670 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9e80a9a-382c-4794-a077-7b9b4a747f03-trusted-ca-bundle\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:26:03.107437 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.107414 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c9e80a9a-382c-4794-a077-7b9b4a747f03" (UID: "c9e80a9a-382c-4794-a077-7b9b4a747f03"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:26:03.107515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.107478 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c9e80a9a-382c-4794-a077-7b9b4a747f03" (UID: "c9e80a9a-382c-4794-a077-7b9b4a747f03"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:26:03.206648 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.206586 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-serving-cert\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:26:03.206648 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.206609 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c9e80a9a-382c-4794-a077-7b9b4a747f03-console-oauth-config\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:26:03.322430 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.322392 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f8d6cbd49-hvvm6"] Apr 17 15:26:03.325942 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:03.325913 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f8d6cbd49-hvvm6"] Apr 17 15:26:04.975457 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:04.975427 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" Apr 17 15:26:05.113891 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:05.113862 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e80a9a-382c-4794-a077-7b9b4a747f03" path="/var/lib/kubelet/pods/c9e80a9a-382c-4794-a077-7b9b4a747f03/volumes" Apr 17 15:26:23.052569 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:23.052532 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm"] Apr 17 15:26:23.052954 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:23.052768 2567 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" podUID="02bb9519-102c-4721-aa4e-91b788176012" containerName="manager" containerID="cri-o://680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c" gracePeriod=10
Apr 17 15:26:23.303002 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:23.302947 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm"
Apr 17 15:26:23.379591 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:23.379559 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zb6m\" (UniqueName: \"kubernetes.io/projected/02bb9519-102c-4721-aa4e-91b788176012-kube-api-access-8zb6m\") pod \"02bb9519-102c-4721-aa4e-91b788176012\" (UID: \"02bb9519-102c-4721-aa4e-91b788176012\") "
Apr 17 15:26:23.379746 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:23.379713 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/02bb9519-102c-4721-aa4e-91b788176012-extensions-socket-volume\") pod \"02bb9519-102c-4721-aa4e-91b788176012\" (UID: \"02bb9519-102c-4721-aa4e-91b788176012\") "
Apr 17 15:26:23.380056 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:23.380026 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02bb9519-102c-4721-aa4e-91b788176012-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "02bb9519-102c-4721-aa4e-91b788176012" (UID: "02bb9519-102c-4721-aa4e-91b788176012"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 15:26:23.381657 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:23.381633 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bb9519-102c-4721-aa4e-91b788176012-kube-api-access-8zb6m" (OuterVolumeSpecName: "kube-api-access-8zb6m") pod "02bb9519-102c-4721-aa4e-91b788176012" (UID: "02bb9519-102c-4721-aa4e-91b788176012"). InnerVolumeSpecName "kube-api-access-8zb6m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 15:26:23.480361 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:23.480324 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/02bb9519-102c-4721-aa4e-91b788176012-extensions-socket-volume\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:26:23.480361 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:23.480362 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8zb6m\" (UniqueName: \"kubernetes.io/projected/02bb9519-102c-4721-aa4e-91b788176012-kube-api-access-8zb6m\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:26:24.091691 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:24.091651 2567 generic.go:358] "Generic (PLEG): container finished" podID="02bb9519-102c-4721-aa4e-91b788176012" containerID="680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c" exitCode=0
Apr 17 15:26:24.092066 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:24.091751 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm"
Apr 17 15:26:24.092066 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:24.091744 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" event={"ID":"02bb9519-102c-4721-aa4e-91b788176012","Type":"ContainerDied","Data":"680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c"}
Apr 17 15:26:24.092066 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:24.091798 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm" event={"ID":"02bb9519-102c-4721-aa4e-91b788176012","Type":"ContainerDied","Data":"05604f5794787efb82019bbc696d57e4fcfdfc3c5f736250e1b840c730f76514"}
Apr 17 15:26:24.092066 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:24.091819 2567 scope.go:117] "RemoveContainer" containerID="680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c"
Apr 17 15:26:24.101433 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:24.101415 2567 scope.go:117] "RemoveContainer" containerID="680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c"
Apr 17 15:26:24.101668 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:26:24.101649 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c\": container with ID starting with 680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c not found: ID does not exist" containerID="680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c"
Apr 17 15:26:24.101734 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:24.101676 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c"} err="failed to get container status \"680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c\": rpc error: code = NotFound desc = could not find container \"680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c\": container with ID starting with 680a57fb30d8abc42e7a2c9fdc36e0065f6039b32afabb5fb94caec734c9d38c not found: ID does not exist"
Apr 17 15:26:24.114985 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:24.114956 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm"]
Apr 17 15:26:24.118538 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:24.118515 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kckqm"]
Apr 17 15:26:25.113010 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:25.112975 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02bb9519-102c-4721-aa4e-91b788176012" path="/var/lib/kubelet/pods/02bb9519-102c-4721-aa4e-91b788176012/volumes"
Apr 17 15:26:40.717945 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.717899 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-586vl"]
Apr 17 15:26:40.718650 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.718607 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9e80a9a-382c-4794-a077-7b9b4a747f03" containerName="console"
Apr 17 15:26:40.718650 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.718630 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e80a9a-382c-4794-a077-7b9b4a747f03" containerName="console"
Apr 17 15:26:40.718650 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.718642 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02bb9519-102c-4721-aa4e-91b788176012" containerName="manager"
Apr 17 15:26:40.718650 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.718651 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="02bb9519-102c-4721-aa4e-91b788176012" containerName="manager"
Apr 17 15:26:40.718940 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.718669 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d10433d8-f189-4435-8e9b-882c1fbb9eca" containerName="manager"
Apr 17 15:26:40.718940 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.718677 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10433d8-f189-4435-8e9b-882c1fbb9eca" containerName="manager"
Apr 17 15:26:40.718940 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.718806 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="d10433d8-f189-4435-8e9b-882c1fbb9eca" containerName="manager"
Apr 17 15:26:40.718940 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.718823 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9e80a9a-382c-4794-a077-7b9b4a747f03" containerName="console"
Apr 17 15:26:40.718940 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.718837 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="02bb9519-102c-4721-aa4e-91b788176012" containerName="manager"
Apr 17 15:26:40.722032 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.722011 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-586vl"
Apr 17 15:26:40.724612 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.724580 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 15:26:40.724703 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.724681 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-7w66g\""
Apr 17 15:26:40.730347 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.730301 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-586vl"]
Apr 17 15:26:40.814387 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.814353 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-586vl"]
Apr 17 15:26:40.832662 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.832628 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqkb5\" (UniqueName: \"kubernetes.io/projected/63c004d9-b9e7-4965-b309-35068e89b4d8-kube-api-access-wqkb5\") pod \"limitador-limitador-7d549b5b-586vl\" (UID: \"63c004d9-b9e7-4965-b309-35068e89b4d8\") " pod="kuadrant-system/limitador-limitador-7d549b5b-586vl"
Apr 17 15:26:40.832815 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.832716 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/63c004d9-b9e7-4965-b309-35068e89b4d8-config-file\") pod \"limitador-limitador-7d549b5b-586vl\" (UID: \"63c004d9-b9e7-4965-b309-35068e89b4d8\") " pod="kuadrant-system/limitador-limitador-7d549b5b-586vl"
Apr 17 15:26:40.933812 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.933776 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/63c004d9-b9e7-4965-b309-35068e89b4d8-config-file\") pod \"limitador-limitador-7d549b5b-586vl\" (UID: \"63c004d9-b9e7-4965-b309-35068e89b4d8\") " pod="kuadrant-system/limitador-limitador-7d549b5b-586vl"
Apr 17 15:26:40.934026 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.933865 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqkb5\" (UniqueName: \"kubernetes.io/projected/63c004d9-b9e7-4965-b309-35068e89b4d8-kube-api-access-wqkb5\") pod \"limitador-limitador-7d549b5b-586vl\" (UID: \"63c004d9-b9e7-4965-b309-35068e89b4d8\") " pod="kuadrant-system/limitador-limitador-7d549b5b-586vl"
Apr 17 15:26:40.934872 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.934846 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/63c004d9-b9e7-4965-b309-35068e89b4d8-config-file\") pod \"limitador-limitador-7d549b5b-586vl\" (UID: \"63c004d9-b9e7-4965-b309-35068e89b4d8\") " pod="kuadrant-system/limitador-limitador-7d549b5b-586vl"
Apr 17 15:26:40.944422 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:40.944399 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqkb5\" (UniqueName: \"kubernetes.io/projected/63c004d9-b9e7-4965-b309-35068e89b4d8-kube-api-access-wqkb5\") pod \"limitador-limitador-7d549b5b-586vl\" (UID: \"63c004d9-b9e7-4965-b309-35068e89b4d8\") " pod="kuadrant-system/limitador-limitador-7d549b5b-586vl"
Apr 17 15:26:41.033301 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:41.033220 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-586vl"
Apr 17 15:26:41.156865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:41.156842 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-586vl"]
Apr 17 15:26:41.159325 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:26:41.159287 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63c004d9_b9e7_4965_b309_35068e89b4d8.slice/crio-2344938c36765735ab754394132f7ba51424e9e75805061384c7a778ee604a23 WatchSource:0}: Error finding container 2344938c36765735ab754394132f7ba51424e9e75805061384c7a778ee604a23: Status 404 returned error can't find the container with id 2344938c36765735ab754394132f7ba51424e9e75805061384c7a778ee604a23
Apr 17 15:26:42.161247 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:42.161203 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-586vl" event={"ID":"63c004d9-b9e7-4965-b309-35068e89b4d8","Type":"ContainerStarted","Data":"2344938c36765735ab754394132f7ba51424e9e75805061384c7a778ee604a23"}
Apr 17 15:26:44.170776 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:44.170741 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-586vl" event={"ID":"63c004d9-b9e7-4965-b309-35068e89b4d8","Type":"ContainerStarted","Data":"6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d"}
Apr 17 15:26:44.171210 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:44.170863 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-586vl"
Apr 17 15:26:44.186135 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:44.186089 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-586vl" podStartSLOduration=1.54482599 podStartE2EDuration="4.186077004s" podCreationTimestamp="2026-04-17 15:26:40 +0000 UTC" firstStartedPulling="2026-04-17 15:26:41.16146723 +0000 UTC m=+574.553009243" lastFinishedPulling="2026-04-17 15:26:43.802718243 +0000 UTC m=+577.194260257" observedRunningTime="2026-04-17 15:26:44.184209166 +0000 UTC m=+577.575751213" watchObservedRunningTime="2026-04-17 15:26:44.186077004 +0000 UTC m=+577.577619039"
Apr 17 15:26:55.175126 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:55.175097 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-586vl"
Apr 17 15:26:56.874373 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:56.874335 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-586vl"]
Apr 17 15:26:56.874871 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:56.874613 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-586vl" podUID="63c004d9-b9e7-4965-b309-35068e89b4d8" containerName="limitador" containerID="cri-o://6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d" gracePeriod=30
Apr 17 15:26:57.411018 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:57.410996 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-586vl"
Apr 17 15:26:57.469588 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:57.469514 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqkb5\" (UniqueName: \"kubernetes.io/projected/63c004d9-b9e7-4965-b309-35068e89b4d8-kube-api-access-wqkb5\") pod \"63c004d9-b9e7-4965-b309-35068e89b4d8\" (UID: \"63c004d9-b9e7-4965-b309-35068e89b4d8\") "
Apr 17 15:26:57.469726 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:57.469585 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/63c004d9-b9e7-4965-b309-35068e89b4d8-config-file\") pod \"63c004d9-b9e7-4965-b309-35068e89b4d8\" (UID: \"63c004d9-b9e7-4965-b309-35068e89b4d8\") "
Apr 17 15:26:57.469916 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:57.469896 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63c004d9-b9e7-4965-b309-35068e89b4d8-config-file" (OuterVolumeSpecName: "config-file") pod "63c004d9-b9e7-4965-b309-35068e89b4d8" (UID: "63c004d9-b9e7-4965-b309-35068e89b4d8"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 15:26:57.471619 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:57.471597 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c004d9-b9e7-4965-b309-35068e89b4d8-kube-api-access-wqkb5" (OuterVolumeSpecName: "kube-api-access-wqkb5") pod "63c004d9-b9e7-4965-b309-35068e89b4d8" (UID: "63c004d9-b9e7-4965-b309-35068e89b4d8"). InnerVolumeSpecName "kube-api-access-wqkb5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 15:26:57.570100 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:57.570067 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wqkb5\" (UniqueName: \"kubernetes.io/projected/63c004d9-b9e7-4965-b309-35068e89b4d8-kube-api-access-wqkb5\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:26:57.570100 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:57.570094 2567 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/63c004d9-b9e7-4965-b309-35068e89b4d8-config-file\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\""
Apr 17 15:26:58.224744 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:58.224709 2567 generic.go:358] "Generic (PLEG): container finished" podID="63c004d9-b9e7-4965-b309-35068e89b4d8" containerID="6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d" exitCode=0
Apr 17 15:26:58.225147 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:58.224774 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-586vl"
Apr 17 15:26:58.225147 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:58.224792 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-586vl" event={"ID":"63c004d9-b9e7-4965-b309-35068e89b4d8","Type":"ContainerDied","Data":"6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d"}
Apr 17 15:26:58.225147 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:58.224832 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-586vl" event={"ID":"63c004d9-b9e7-4965-b309-35068e89b4d8","Type":"ContainerDied","Data":"2344938c36765735ab754394132f7ba51424e9e75805061384c7a778ee604a23"}
Apr 17 15:26:58.225147 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:58.224853 2567 scope.go:117] "RemoveContainer" containerID="6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d"
Apr 17 15:26:58.233501 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:58.233489 2567 scope.go:117] "RemoveContainer" containerID="6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d"
Apr 17 15:26:58.233718 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:26:58.233698 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d\": container with ID starting with 6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d not found: ID does not exist" containerID="6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d"
Apr 17 15:26:58.233764 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:58.233726 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d"} err="failed to get container status \"6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d\": rpc error: code = NotFound desc = could not find container \"6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d\": container with ID starting with 6e7dc3c3f17ea29350a6b509bde74ead43f4b8c7f1762cafb995beec6e27785d not found: ID does not exist"
Apr 17 15:26:58.257859 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:58.257839 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-586vl"]
Apr 17 15:26:58.260044 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:58.260025 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-586vl"]
Apr 17 15:26:59.113698 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:26:59.113662 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63c004d9-b9e7-4965-b309-35068e89b4d8" path="/var/lib/kubelet/pods/63c004d9-b9e7-4965-b309-35068e89b4d8/volumes"
Apr 17 15:27:02.222925 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.222891 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-wl7nz"]
Apr 17 15:27:02.223289 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.223232 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63c004d9-b9e7-4965-b309-35068e89b4d8" containerName="limitador"
Apr 17 15:27:02.223289 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.223245 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c004d9-b9e7-4965-b309-35068e89b4d8" containerName="limitador"
Apr 17 15:27:02.223395 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.223327 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="63c004d9-b9e7-4965-b309-35068e89b4d8" containerName="limitador"
Apr 17 15:27:02.227664 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.227643 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-wl7nz"
Apr 17 15:27:02.229992 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.229969 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 17 15:27:02.230237 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.230215 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-2f8f4\""
Apr 17 15:27:02.236341 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.236265 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-wl7nz"]
Apr 17 15:27:02.307725 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.307687 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8ab53a7e-e165-4e21-abe7-b7b332046d0a-data\") pod \"postgres-868db5846d-wl7nz\" (UID: \"8ab53a7e-e165-4e21-abe7-b7b332046d0a\") " pod="opendatahub/postgres-868db5846d-wl7nz"
Apr 17 15:27:02.307904 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.307748 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h48mh\" (UniqueName: \"kubernetes.io/projected/8ab53a7e-e165-4e21-abe7-b7b332046d0a-kube-api-access-h48mh\") pod \"postgres-868db5846d-wl7nz\" (UID: \"8ab53a7e-e165-4e21-abe7-b7b332046d0a\") " pod="opendatahub/postgres-868db5846d-wl7nz"
Apr 17 15:27:02.408265 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.408228 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h48mh\" (UniqueName: \"kubernetes.io/projected/8ab53a7e-e165-4e21-abe7-b7b332046d0a-kube-api-access-h48mh\") pod \"postgres-868db5846d-wl7nz\" (UID: \"8ab53a7e-e165-4e21-abe7-b7b332046d0a\") " pod="opendatahub/postgres-868db5846d-wl7nz"
Apr 17 15:27:02.408456 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.408343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8ab53a7e-e165-4e21-abe7-b7b332046d0a-data\") pod \"postgres-868db5846d-wl7nz\" (UID: \"8ab53a7e-e165-4e21-abe7-b7b332046d0a\") " pod="opendatahub/postgres-868db5846d-wl7nz"
Apr 17 15:27:02.408675 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.408659 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8ab53a7e-e165-4e21-abe7-b7b332046d0a-data\") pod \"postgres-868db5846d-wl7nz\" (UID: \"8ab53a7e-e165-4e21-abe7-b7b332046d0a\") " pod="opendatahub/postgres-868db5846d-wl7nz"
Apr 17 15:27:02.415996 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.415969 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h48mh\" (UniqueName: \"kubernetes.io/projected/8ab53a7e-e165-4e21-abe7-b7b332046d0a-kube-api-access-h48mh\") pod \"postgres-868db5846d-wl7nz\" (UID: \"8ab53a7e-e165-4e21-abe7-b7b332046d0a\") " pod="opendatahub/postgres-868db5846d-wl7nz"
Apr 17 15:27:02.541985 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.541893 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-wl7nz"
Apr 17 15:27:02.659486 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:02.659464 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-wl7nz"]
Apr 17 15:27:02.661054 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:27:02.661028 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ab53a7e_e165_4e21_abe7_b7b332046d0a.slice/crio-bca49e2499c0714cfa722de411b0e974e1829689993c00ae891550d94495084d WatchSource:0}: Error finding container bca49e2499c0714cfa722de411b0e974e1829689993c00ae891550d94495084d: Status 404 returned error can't find the container with id bca49e2499c0714cfa722de411b0e974e1829689993c00ae891550d94495084d
Apr 17 15:27:03.247141 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:03.247102 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-wl7nz" event={"ID":"8ab53a7e-e165-4e21-abe7-b7b332046d0a","Type":"ContainerStarted","Data":"bca49e2499c0714cfa722de411b0e974e1829689993c00ae891550d94495084d"}
Apr 17 15:27:09.323391 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:09.323366 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 17 15:27:10.277393 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:10.277354 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-wl7nz" event={"ID":"8ab53a7e-e165-4e21-abe7-b7b332046d0a","Type":"ContainerStarted","Data":"64ac932ebd6c7614d000c11d5d07adf9b637ad4e6a65f68db51d19ed78a5518d"}
Apr 17 15:27:10.277585 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:10.277460 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-wl7nz"
Apr 17 15:27:10.296039 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:10.295988 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-wl7nz" podStartSLOduration=1.637547047 podStartE2EDuration="8.295974047s" podCreationTimestamp="2026-04-17 15:27:02 +0000 UTC" firstStartedPulling="2026-04-17 15:27:02.662257567 +0000 UTC m=+596.053799580" lastFinishedPulling="2026-04-17 15:27:09.320684568 +0000 UTC m=+602.712226580" observedRunningTime="2026-04-17 15:27:10.293209888 +0000 UTC m=+603.684751922" watchObservedRunningTime="2026-04-17 15:27:10.295974047 +0000 UTC m=+603.687516082"
Apr 17 15:27:16.309887 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:16.309849 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-wl7nz"
Apr 17 15:27:17.100964 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.100927 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-dc6fb7d79-2ftbv"]
Apr 17 15:27:17.112988 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.112954 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-dc6fb7d79-2ftbv"
Apr 17 15:27:17.115886 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.115660 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 17 15:27:17.115886 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.115746 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 17 15:27:17.115886 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.115882 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-dc6fb7d79-2ftbv"]
Apr 17 15:27:17.116129 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.115992 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-g4s5v\""
Apr 17 15:27:17.123385 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.123178 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-77c949fc6b-vjvfn"]
Apr 17 15:27:17.127372 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.127353 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-77c949fc6b-vjvfn"
Apr 17 15:27:17.129812 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.129793 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-lw7qj\""
Apr 17 15:27:17.135782 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.135752 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-77c949fc6b-vjvfn"]
Apr 17 15:27:17.232482 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.232453 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws8cc\" (UniqueName: \"kubernetes.io/projected/69e352cc-5e0b-4651-957f-f9ae910f82cc-kube-api-access-ws8cc\") pod \"maas-api-dc6fb7d79-2ftbv\" (UID: \"69e352cc-5e0b-4651-957f-f9ae910f82cc\") " pod="opendatahub/maas-api-dc6fb7d79-2ftbv"
Apr 17 15:27:17.232482 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.232487 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7259\" (UniqueName: \"kubernetes.io/projected/9124d6cc-b2c4-4f9d-9d1e-757f1fc49077-kube-api-access-t7259\") pod \"maas-controller-77c949fc6b-vjvfn\" (UID: \"9124d6cc-b2c4-4f9d-9d1e-757f1fc49077\") " pod="opendatahub/maas-controller-77c949fc6b-vjvfn"
Apr 17 15:27:17.232694 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.232525 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/69e352cc-5e0b-4651-957f-f9ae910f82cc-maas-api-tls\") pod \"maas-api-dc6fb7d79-2ftbv\" (UID: \"69e352cc-5e0b-4651-957f-f9ae910f82cc\") " pod="opendatahub/maas-api-dc6fb7d79-2ftbv"
Apr 17 15:27:17.333123 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.333088 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ws8cc\" (UniqueName: \"kubernetes.io/projected/69e352cc-5e0b-4651-957f-f9ae910f82cc-kube-api-access-ws8cc\") pod \"maas-api-dc6fb7d79-2ftbv\" (UID: \"69e352cc-5e0b-4651-957f-f9ae910f82cc\") " pod="opendatahub/maas-api-dc6fb7d79-2ftbv"
Apr 17 15:27:17.333617 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.333138 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7259\" (UniqueName: \"kubernetes.io/projected/9124d6cc-b2c4-4f9d-9d1e-757f1fc49077-kube-api-access-t7259\") pod \"maas-controller-77c949fc6b-vjvfn\" (UID: \"9124d6cc-b2c4-4f9d-9d1e-757f1fc49077\") " pod="opendatahub/maas-controller-77c949fc6b-vjvfn"
Apr 17 15:27:17.333617 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.333194 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/69e352cc-5e0b-4651-957f-f9ae910f82cc-maas-api-tls\") pod \"maas-api-dc6fb7d79-2ftbv\" (UID: \"69e352cc-5e0b-4651-957f-f9ae910f82cc\") " pod="opendatahub/maas-api-dc6fb7d79-2ftbv"
Apr 17 15:27:17.333617 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:27:17.333372 2567 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found
Apr 17 15:27:17.333617 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:27:17.333470 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69e352cc-5e0b-4651-957f-f9ae910f82cc-maas-api-tls podName:69e352cc-5e0b-4651-957f-f9ae910f82cc nodeName:}" failed. No retries permitted until 2026-04-17 15:27:17.833452093 +0000 UTC m=+611.224994122 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/69e352cc-5e0b-4651-957f-f9ae910f82cc-maas-api-tls") pod "maas-api-dc6fb7d79-2ftbv" (UID: "69e352cc-5e0b-4651-957f-f9ae910f82cc") : secret "maas-api-serving-cert" not found
Apr 17 15:27:17.341636 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.341614 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws8cc\" (UniqueName: \"kubernetes.io/projected/69e352cc-5e0b-4651-957f-f9ae910f82cc-kube-api-access-ws8cc\") pod \"maas-api-dc6fb7d79-2ftbv\" (UID: \"69e352cc-5e0b-4651-957f-f9ae910f82cc\") " pod="opendatahub/maas-api-dc6fb7d79-2ftbv"
Apr 17 15:27:17.341759 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.341657 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7259\" (UniqueName: \"kubernetes.io/projected/9124d6cc-b2c4-4f9d-9d1e-757f1fc49077-kube-api-access-t7259\") pod \"maas-controller-77c949fc6b-vjvfn\" (UID: \"9124d6cc-b2c4-4f9d-9d1e-757f1fc49077\") " pod="opendatahub/maas-controller-77c949fc6b-vjvfn"
Apr 17 15:27:17.440178 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.440141 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-77c949fc6b-vjvfn"
Apr 17 15:27:17.569172 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.569143 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-77c949fc6b-vjvfn"]
Apr 17 15:27:17.571538 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:27:17.571504 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9124d6cc_b2c4_4f9d_9d1e_757f1fc49077.slice/crio-c52101bd23d8c53da5fd19df3cc31f9c3bf5f3a850a74007fef09bf1a7808105 WatchSource:0}: Error finding container c52101bd23d8c53da5fd19df3cc31f9c3bf5f3a850a74007fef09bf1a7808105: Status 404 returned error can't find the container with id c52101bd23d8c53da5fd19df3cc31f9c3bf5f3a850a74007fef09bf1a7808105
Apr 17 15:27:17.838288 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.838199 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/69e352cc-5e0b-4651-957f-f9ae910f82cc-maas-api-tls\") pod \"maas-api-dc6fb7d79-2ftbv\" (UID: \"69e352cc-5e0b-4651-957f-f9ae910f82cc\") " pod="opendatahub/maas-api-dc6fb7d79-2ftbv"
Apr 17 15:27:17.840645 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.840620 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/69e352cc-5e0b-4651-957f-f9ae910f82cc-maas-api-tls\") pod \"maas-api-dc6fb7d79-2ftbv\" (UID: \"69e352cc-5e0b-4651-957f-f9ae910f82cc\") " pod="opendatahub/maas-api-dc6fb7d79-2ftbv"
Apr 17 15:27:17.919892 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.919851 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-698ccc456c-xjxql"]
Apr 17 15:27:17.925527 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.925510 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-698ccc456c-xjxql"
Apr 17 15:27:17.929279 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:17.929255 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-698ccc456c-xjxql"]
Apr 17 15:27:18.028485 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:18.028458 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-dc6fb7d79-2ftbv"
Apr 17 15:27:18.040424 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:18.040396 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmpws\" (UniqueName: \"kubernetes.io/projected/08fa5197-d6b6-497f-bc14-9c60e27747f8-kube-api-access-mmpws\") pod \"maas-api-698ccc456c-xjxql\" (UID: \"08fa5197-d6b6-497f-bc14-9c60e27747f8\") " pod="opendatahub/maas-api-698ccc456c-xjxql"
Apr 17 15:27:18.040571 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:18.040457 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/08fa5197-d6b6-497f-bc14-9c60e27747f8-maas-api-tls\") pod \"maas-api-698ccc456c-xjxql\" (UID: \"08fa5197-d6b6-497f-bc14-9c60e27747f8\") " pod="opendatahub/maas-api-698ccc456c-xjxql"
Apr 17 15:27:18.142266 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:18.141670 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmpws\" (UniqueName: \"kubernetes.io/projected/08fa5197-d6b6-497f-bc14-9c60e27747f8-kube-api-access-mmpws\") pod \"maas-api-698ccc456c-xjxql\" (UID: \"08fa5197-d6b6-497f-bc14-9c60e27747f8\") " pod="opendatahub/maas-api-698ccc456c-xjxql"
Apr 17 15:27:18.142266 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:18.141764 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/08fa5197-d6b6-497f-bc14-9c60e27747f8-maas-api-tls\") pod \"maas-api-698ccc456c-xjxql\" (UID: \"08fa5197-d6b6-497f-bc14-9c60e27747f8\") " pod="opendatahub/maas-api-698ccc456c-xjxql"
Apr 17 15:27:18.145090 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:18.145037 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/08fa5197-d6b6-497f-bc14-9c60e27747f8-maas-api-tls\") pod \"maas-api-698ccc456c-xjxql\" (UID: \"08fa5197-d6b6-497f-bc14-9c60e27747f8\") " pod="opendatahub/maas-api-698ccc456c-xjxql"
Apr 17 15:27:18.151237 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:18.151210 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmpws\" (UniqueName: \"kubernetes.io/projected/08fa5197-d6b6-497f-bc14-9c60e27747f8-kube-api-access-mmpws\") pod \"maas-api-698ccc456c-xjxql\" (UID: \"08fa5197-d6b6-497f-bc14-9c60e27747f8\") " pod="opendatahub/maas-api-698ccc456c-xjxql"
Apr 17 15:27:18.171883 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:18.171855 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-dc6fb7d79-2ftbv"]
Apr 17 15:27:18.173474 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:27:18.173444 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e352cc_5e0b_4651_957f_f9ae910f82cc.slice/crio-c8716e91cbb2ed0d8891cd37de96b963fb809b9ea7b3cc7fcf6c6228ffe0536d WatchSource:0}: Error finding container c8716e91cbb2ed0d8891cd37de96b963fb809b9ea7b3cc7fcf6c6228ffe0536d: Status 404 returned error can't find the container with id c8716e91cbb2ed0d8891cd37de96b963fb809b9ea7b3cc7fcf6c6228ffe0536d
Apr 17 15:27:18.239089 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:18.239060 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="opendatahub/maas-api-698ccc456c-xjxql" Apr 17 15:27:18.311192 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:18.311144 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-dc6fb7d79-2ftbv" event={"ID":"69e352cc-5e0b-4651-957f-f9ae910f82cc","Type":"ContainerStarted","Data":"c8716e91cbb2ed0d8891cd37de96b963fb809b9ea7b3cc7fcf6c6228ffe0536d"} Apr 17 15:27:18.312481 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:18.312438 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c949fc6b-vjvfn" event={"ID":"9124d6cc-b2c4-4f9d-9d1e-757f1fc49077","Type":"ContainerStarted","Data":"c52101bd23d8c53da5fd19df3cc31f9c3bf5f3a850a74007fef09bf1a7808105"} Apr 17 15:27:18.594144 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:18.594111 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-698ccc456c-xjxql"] Apr 17 15:27:18.595342 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:27:18.595292 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08fa5197_d6b6_497f_bc14_9c60e27747f8.slice/crio-c77bd85f366ea64ed0767ae6776521d12025bb63c597a78c6dfb3f99c10e8cf9 WatchSource:0}: Error finding container c77bd85f366ea64ed0767ae6776521d12025bb63c597a78c6dfb3f99c10e8cf9: Status 404 returned error can't find the container with id c77bd85f366ea64ed0767ae6776521d12025bb63c597a78c6dfb3f99c10e8cf9 Apr 17 15:27:19.319642 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:19.319588 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-698ccc456c-xjxql" event={"ID":"08fa5197-d6b6-497f-bc14-9c60e27747f8","Type":"ContainerStarted","Data":"c77bd85f366ea64ed0767ae6776521d12025bb63c597a78c6dfb3f99c10e8cf9"} Apr 17 15:27:21.329355 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:21.329306 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-698ccc456c-xjxql" 
event={"ID":"08fa5197-d6b6-497f-bc14-9c60e27747f8","Type":"ContainerStarted","Data":"35f36e010b14e7817c6ed1033bd4660effb4d881e1f0a05e7bf13d0130127537"} Apr 17 15:27:21.329796 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:21.329433 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-698ccc456c-xjxql" Apr 17 15:27:21.330759 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:21.330734 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-dc6fb7d79-2ftbv" event={"ID":"69e352cc-5e0b-4651-957f-f9ae910f82cc","Type":"ContainerStarted","Data":"c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320"} Apr 17 15:27:21.330909 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:21.330866 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-dc6fb7d79-2ftbv" Apr 17 15:27:21.331993 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:21.331973 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c949fc6b-vjvfn" event={"ID":"9124d6cc-b2c4-4f9d-9d1e-757f1fc49077","Type":"ContainerStarted","Data":"a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a"} Apr 17 15:27:21.332090 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:21.332067 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-77c949fc6b-vjvfn" Apr 17 15:27:21.351234 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:21.351189 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-698ccc456c-xjxql" podStartSLOduration=2.54763173 podStartE2EDuration="4.351177233s" podCreationTimestamp="2026-04-17 15:27:17 +0000 UTC" firstStartedPulling="2026-04-17 15:27:18.597084116 +0000 UTC m=+611.988626131" lastFinishedPulling="2026-04-17 15:27:20.400629606 +0000 UTC m=+613.792171634" observedRunningTime="2026-04-17 15:27:21.349749085 +0000 UTC m=+614.741291120" 
watchObservedRunningTime="2026-04-17 15:27:21.351177233 +0000 UTC m=+614.742719245" Apr 17 15:27:21.366084 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:21.366043 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-77c949fc6b-vjvfn" podStartSLOduration=1.541567328 podStartE2EDuration="4.366028125s" podCreationTimestamp="2026-04-17 15:27:17 +0000 UTC" firstStartedPulling="2026-04-17 15:27:17.573170387 +0000 UTC m=+610.964712412" lastFinishedPulling="2026-04-17 15:27:20.397631197 +0000 UTC m=+613.789173209" observedRunningTime="2026-04-17 15:27:21.3651732 +0000 UTC m=+614.756715236" watchObservedRunningTime="2026-04-17 15:27:21.366028125 +0000 UTC m=+614.757570165" Apr 17 15:27:21.382414 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:21.382374 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-dc6fb7d79-2ftbv" podStartSLOduration=2.159252821 podStartE2EDuration="4.382362462s" podCreationTimestamp="2026-04-17 15:27:17 +0000 UTC" firstStartedPulling="2026-04-17 15:27:18.174971514 +0000 UTC m=+611.566513528" lastFinishedPulling="2026-04-17 15:27:20.398081157 +0000 UTC m=+613.789623169" observedRunningTime="2026-04-17 15:27:21.380471179 +0000 UTC m=+614.772013214" watchObservedRunningTime="2026-04-17 15:27:21.382362462 +0000 UTC m=+614.773904498" Apr 17 15:27:27.341140 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:27.341062 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-dc6fb7d79-2ftbv" Apr 17 15:27:27.343654 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:27.341567 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-698ccc456c-xjxql" Apr 17 15:27:27.395758 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:27.395728 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-dc6fb7d79-2ftbv"] Apr 17 15:27:27.395966 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:27:27.395925 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-dc6fb7d79-2ftbv" podUID="69e352cc-5e0b-4651-957f-f9ae910f82cc" containerName="maas-api" containerID="cri-o://c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320" gracePeriod=30 Apr 17 15:27:27.636594 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:27.636572 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-dc6fb7d79-2ftbv" Apr 17 15:27:27.725883 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:27.725852 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws8cc\" (UniqueName: \"kubernetes.io/projected/69e352cc-5e0b-4651-957f-f9ae910f82cc-kube-api-access-ws8cc\") pod \"69e352cc-5e0b-4651-957f-f9ae910f82cc\" (UID: \"69e352cc-5e0b-4651-957f-f9ae910f82cc\") " Apr 17 15:27:27.726037 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:27.725895 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/69e352cc-5e0b-4651-957f-f9ae910f82cc-maas-api-tls\") pod \"69e352cc-5e0b-4651-957f-f9ae910f82cc\" (UID: \"69e352cc-5e0b-4651-957f-f9ae910f82cc\") " Apr 17 15:27:27.728091 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:27.728063 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e352cc-5e0b-4651-957f-f9ae910f82cc-kube-api-access-ws8cc" (OuterVolumeSpecName: "kube-api-access-ws8cc") pod "69e352cc-5e0b-4651-957f-f9ae910f82cc" (UID: "69e352cc-5e0b-4651-957f-f9ae910f82cc"). InnerVolumeSpecName "kube-api-access-ws8cc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:27:27.728199 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:27.728102 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e352cc-5e0b-4651-957f-f9ae910f82cc-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "69e352cc-5e0b-4651-957f-f9ae910f82cc" (UID: "69e352cc-5e0b-4651-957f-f9ae910f82cc"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:27:27.826670 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:27.826636 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ws8cc\" (UniqueName: \"kubernetes.io/projected/69e352cc-5e0b-4651-957f-f9ae910f82cc-kube-api-access-ws8cc\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:27:27.826670 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:27.826667 2567 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/69e352cc-5e0b-4651-957f-f9ae910f82cc-maas-api-tls\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:27:28.363371 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:28.363332 2567 generic.go:358] "Generic (PLEG): container finished" podID="69e352cc-5e0b-4651-957f-f9ae910f82cc" containerID="c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320" exitCode=0 Apr 17 15:27:28.363763 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:28.363393 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-dc6fb7d79-2ftbv" Apr 17 15:27:28.363763 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:28.363392 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-dc6fb7d79-2ftbv" event={"ID":"69e352cc-5e0b-4651-957f-f9ae910f82cc","Type":"ContainerDied","Data":"c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320"} Apr 17 15:27:28.363763 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:28.363494 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-dc6fb7d79-2ftbv" event={"ID":"69e352cc-5e0b-4651-957f-f9ae910f82cc","Type":"ContainerDied","Data":"c8716e91cbb2ed0d8891cd37de96b963fb809b9ea7b3cc7fcf6c6228ffe0536d"} Apr 17 15:27:28.363763 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:28.363510 2567 scope.go:117] "RemoveContainer" containerID="c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320" Apr 17 15:27:28.372463 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:28.372433 2567 scope.go:117] "RemoveContainer" containerID="c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320" Apr 17 15:27:28.372674 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:27:28.372656 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320\": container with ID starting with c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320 not found: ID does not exist" containerID="c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320" Apr 17 15:27:28.372714 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:28.372684 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320"} err="failed to get container status \"c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320\": rpc error: code = NotFound desc = could not 
find container \"c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320\": container with ID starting with c5192865aebdbbf81f9abb84e58e1db03fdf3cd31543297af07feb16fe562320 not found: ID does not exist" Apr 17 15:27:28.384117 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:28.384093 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-dc6fb7d79-2ftbv"] Apr 17 15:27:28.387665 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:28.387646 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-dc6fb7d79-2ftbv"] Apr 17 15:27:29.113457 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:29.113420 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e352cc-5e0b-4651-957f-f9ae910f82cc" path="/var/lib/kubelet/pods/69e352cc-5e0b-4651-957f-f9ae910f82cc/volumes" Apr 17 15:27:32.340672 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:32.340641 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-77c949fc6b-vjvfn" Apr 17 15:27:32.637578 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:32.637546 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-796df7bcdb-64phj"] Apr 17 15:27:32.637958 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:32.637945 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69e352cc-5e0b-4651-957f-f9ae910f82cc" containerName="maas-api" Apr 17 15:27:32.638007 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:32.637960 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e352cc-5e0b-4651-957f-f9ae910f82cc" containerName="maas-api" Apr 17 15:27:32.638043 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:32.638031 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="69e352cc-5e0b-4651-957f-f9ae910f82cc" containerName="maas-api" Apr 17 15:27:32.642301 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:32.642286 2567 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="opendatahub/maas-controller-796df7bcdb-64phj" Apr 17 15:27:32.646583 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:32.646561 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-796df7bcdb-64phj"] Apr 17 15:27:32.768694 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:32.768663 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g52pm\" (UniqueName: \"kubernetes.io/projected/c2e5acae-41c9-4108-8435-f58254d5ce79-kube-api-access-g52pm\") pod \"maas-controller-796df7bcdb-64phj\" (UID: \"c2e5acae-41c9-4108-8435-f58254d5ce79\") " pod="opendatahub/maas-controller-796df7bcdb-64phj" Apr 17 15:27:32.869451 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:32.869407 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g52pm\" (UniqueName: \"kubernetes.io/projected/c2e5acae-41c9-4108-8435-f58254d5ce79-kube-api-access-g52pm\") pod \"maas-controller-796df7bcdb-64phj\" (UID: \"c2e5acae-41c9-4108-8435-f58254d5ce79\") " pod="opendatahub/maas-controller-796df7bcdb-64phj" Apr 17 15:27:32.877894 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:32.877866 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g52pm\" (UniqueName: \"kubernetes.io/projected/c2e5acae-41c9-4108-8435-f58254d5ce79-kube-api-access-g52pm\") pod \"maas-controller-796df7bcdb-64phj\" (UID: \"c2e5acae-41c9-4108-8435-f58254d5ce79\") " pod="opendatahub/maas-controller-796df7bcdb-64phj" Apr 17 15:27:32.955888 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:32.955821 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-796df7bcdb-64phj" Apr 17 15:27:33.077323 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:33.077268 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-796df7bcdb-64phj"] Apr 17 15:27:33.079782 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:27:33.079755 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2e5acae_41c9_4108_8435_f58254d5ce79.slice/crio-4993a7ec2d52b70a3eaa6753a886df5ee84f05d4a92acaa344bdadb0a37ff59d WatchSource:0}: Error finding container 4993a7ec2d52b70a3eaa6753a886df5ee84f05d4a92acaa344bdadb0a37ff59d: Status 404 returned error can't find the container with id 4993a7ec2d52b70a3eaa6753a886df5ee84f05d4a92acaa344bdadb0a37ff59d Apr 17 15:27:33.081022 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:33.081007 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 15:27:33.384443 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:33.384406 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-796df7bcdb-64phj" event={"ID":"c2e5acae-41c9-4108-8435-f58254d5ce79","Type":"ContainerStarted","Data":"4993a7ec2d52b70a3eaa6753a886df5ee84f05d4a92acaa344bdadb0a37ff59d"} Apr 17 15:27:34.390012 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:34.389975 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-796df7bcdb-64phj" event={"ID":"c2e5acae-41c9-4108-8435-f58254d5ce79","Type":"ContainerStarted","Data":"cca8b2919efaf894cee7708d68fb618bc6b56384211ffa7105619639306c9060"} Apr 17 15:27:34.390416 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:34.390053 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-796df7bcdb-64phj" Apr 17 15:27:34.408025 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:34.407978 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-796df7bcdb-64phj" podStartSLOduration=2.068930658 podStartE2EDuration="2.407964056s" podCreationTimestamp="2026-04-17 15:27:32 +0000 UTC" firstStartedPulling="2026-04-17 15:27:33.08112471 +0000 UTC m=+626.472666723" lastFinishedPulling="2026-04-17 15:27:33.420158108 +0000 UTC m=+626.811700121" observedRunningTime="2026-04-17 15:27:34.404741665 +0000 UTC m=+627.796283713" watchObservedRunningTime="2026-04-17 15:27:34.407964056 +0000 UTC m=+627.799506092" Apr 17 15:27:45.399725 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:45.399691 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-796df7bcdb-64phj" Apr 17 15:27:45.440786 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:45.440757 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-77c949fc6b-vjvfn"] Apr 17 15:27:45.440995 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:45.440974 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-77c949fc6b-vjvfn" podUID="9124d6cc-b2c4-4f9d-9d1e-757f1fc49077" containerName="manager" containerID="cri-o://a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a" gracePeriod=10 Apr 17 15:27:45.678232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:45.678210 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-77c949fc6b-vjvfn" Apr 17 15:27:45.783083 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:45.783050 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7259\" (UniqueName: \"kubernetes.io/projected/9124d6cc-b2c4-4f9d-9d1e-757f1fc49077-kube-api-access-t7259\") pod \"9124d6cc-b2c4-4f9d-9d1e-757f1fc49077\" (UID: \"9124d6cc-b2c4-4f9d-9d1e-757f1fc49077\") " Apr 17 15:27:45.785088 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:45.785050 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9124d6cc-b2c4-4f9d-9d1e-757f1fc49077-kube-api-access-t7259" (OuterVolumeSpecName: "kube-api-access-t7259") pod "9124d6cc-b2c4-4f9d-9d1e-757f1fc49077" (UID: "9124d6cc-b2c4-4f9d-9d1e-757f1fc49077"). InnerVolumeSpecName "kube-api-access-t7259". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:27:45.883830 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:45.883799 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t7259\" (UniqueName: \"kubernetes.io/projected/9124d6cc-b2c4-4f9d-9d1e-757f1fc49077-kube-api-access-t7259\") on node \"ip-10-0-130-92.ec2.internal\" DevicePath \"\"" Apr 17 15:27:46.435794 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:46.435757 2567 generic.go:358] "Generic (PLEG): container finished" podID="9124d6cc-b2c4-4f9d-9d1e-757f1fc49077" containerID="a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a" exitCode=0 Apr 17 15:27:46.436241 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:46.435818 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-77c949fc6b-vjvfn" Apr 17 15:27:46.436241 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:46.435846 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c949fc6b-vjvfn" event={"ID":"9124d6cc-b2c4-4f9d-9d1e-757f1fc49077","Type":"ContainerDied","Data":"a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a"} Apr 17 15:27:46.436241 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:46.435882 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c949fc6b-vjvfn" event={"ID":"9124d6cc-b2c4-4f9d-9d1e-757f1fc49077","Type":"ContainerDied","Data":"c52101bd23d8c53da5fd19df3cc31f9c3bf5f3a850a74007fef09bf1a7808105"} Apr 17 15:27:46.436241 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:46.435897 2567 scope.go:117] "RemoveContainer" containerID="a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a" Apr 17 15:27:46.445808 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:46.445637 2567 scope.go:117] "RemoveContainer" containerID="a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a" Apr 17 15:27:46.445907 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:27:46.445889 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a\": container with ID starting with a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a not found: ID does not exist" containerID="a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a" Apr 17 15:27:46.445948 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:46.445915 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a"} err="failed to get container status \"a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a\": rpc error: code = 
NotFound desc = could not find container \"a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a\": container with ID starting with a0a85b904ab605ff0cd4a6dcbfe9af7496be559c652d94c491f258433295d41a not found: ID does not exist" Apr 17 15:27:46.460543 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:46.460519 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-77c949fc6b-vjvfn"] Apr 17 15:27:46.468846 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:46.468825 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-77c949fc6b-vjvfn"] Apr 17 15:27:47.116375 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:47.116339 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9124d6cc-b2c4-4f9d-9d1e-757f1fc49077" path="/var/lib/kubelet/pods/9124d6cc-b2c4-4f9d-9d1e-757f1fc49077/volumes" Apr 17 15:27:55.858667 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.858628 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr"] Apr 17 15:27:55.860084 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.860052 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9124d6cc-b2c4-4f9d-9d1e-757f1fc49077" containerName="manager" Apr 17 15:27:55.860257 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.860244 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="9124d6cc-b2c4-4f9d-9d1e-757f1fc49077" containerName="manager" Apr 17 15:27:55.860550 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.860535 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="9124d6cc-b2c4-4f9d-9d1e-757f1fc49077" containerName="manager" Apr 17 15:27:55.869347 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.869299 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:55.872140 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.872118 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 17 15:27:55.872338 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.872215 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 15:27:55.873192 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.873171 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr"] Apr 17 15:27:55.873376 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.873361 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-xpqz8\"" Apr 17 15:27:55.873466 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.873376 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 15:27:55.964695 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.964668 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73ad47c6-e149-436f-b405-396b949ce55e-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:55.964844 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.964705 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73ad47c6-e149-436f-b405-396b949ce55e-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: 
\"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:55.964844 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.964794 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wz84\" (UniqueName: \"kubernetes.io/projected/73ad47c6-e149-436f-b405-396b949ce55e-kube-api-access-2wz84\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:55.964983 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.964904 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73ad47c6-e149-436f-b405-396b949ce55e-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:55.964983 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.964940 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73ad47c6-e149-436f-b405-396b949ce55e-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:55.964983 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:55.964970 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73ad47c6-e149-436f-b405-396b949ce55e-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " 
pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.065985 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.065953 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wz84\" (UniqueName: \"kubernetes.io/projected/73ad47c6-e149-436f-b405-396b949ce55e-kube-api-access-2wz84\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.066151 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.066005 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73ad47c6-e149-436f-b405-396b949ce55e-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.066151 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.066024 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73ad47c6-e149-436f-b405-396b949ce55e-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.066151 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.066046 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73ad47c6-e149-436f-b405-396b949ce55e-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.066151 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.066069 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73ad47c6-e149-436f-b405-396b949ce55e-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.066151 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.066088 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73ad47c6-e149-436f-b405-396b949ce55e-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.066519 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.066494 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73ad47c6-e149-436f-b405-396b949ce55e-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.066646 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.066567 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73ad47c6-e149-436f-b405-396b949ce55e-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.066724 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.066666 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/73ad47c6-e149-436f-b405-396b949ce55e-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.068512 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.068493 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73ad47c6-e149-436f-b405-396b949ce55e-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.068680 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.068664 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73ad47c6-e149-436f-b405-396b949ce55e-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.073840 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.073816 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wz84\" (UniqueName: \"kubernetes.io/projected/73ad47c6-e149-436f-b405-396b949ce55e-kube-api-access-2wz84\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-z7jcr\" (UID: \"73ad47c6-e149-436f-b405-396b949ce55e\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.181040 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.181013 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:27:56.313990 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.313946 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr"] Apr 17 15:27:56.317229 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:27:56.317202 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73ad47c6_e149_436f_b405_396b949ce55e.slice/crio-68c7a8148bb6a6ed5925422072b5f9bf7e4d94dda2b57beeced1b2bf5443d659 WatchSource:0}: Error finding container 68c7a8148bb6a6ed5925422072b5f9bf7e4d94dda2b57beeced1b2bf5443d659: Status 404 returned error can't find the container with id 68c7a8148bb6a6ed5925422072b5f9bf7e4d94dda2b57beeced1b2bf5443d659 Apr 17 15:27:56.476366 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:27:56.476268 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" event={"ID":"73ad47c6-e149-436f-b405-396b949ce55e","Type":"ContainerStarted","Data":"68c7a8148bb6a6ed5925422072b5f9bf7e4d94dda2b57beeced1b2bf5443d659"} Apr 17 15:28:04.512385 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:04.512347 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" event={"ID":"73ad47c6-e149-436f-b405-396b949ce55e","Type":"ContainerStarted","Data":"1c3b86174c73b2b1085e6872ae08b62d1c220438f07f1d2cb3f3f484833e560b"} Apr 17 15:28:11.545256 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:11.545216 2567 generic.go:358] "Generic (PLEG): container finished" podID="73ad47c6-e149-436f-b405-396b949ce55e" containerID="1c3b86174c73b2b1085e6872ae08b62d1c220438f07f1d2cb3f3f484833e560b" exitCode=0 Apr 17 15:28:11.545569 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:11.545265 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" event={"ID":"73ad47c6-e149-436f-b405-396b949ce55e","Type":"ContainerDied","Data":"1c3b86174c73b2b1085e6872ae08b62d1c220438f07f1d2cb3f3f484833e560b"} Apr 17 15:28:13.561006 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:13.560974 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/0.log" Apr 17 15:28:13.561464 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:13.561305 2567 generic.go:358] "Generic (PLEG): container finished" podID="73ad47c6-e149-436f-b405-396b949ce55e" containerID="8b6693011e9bd85bbf2a79b7efde338c7033d0b794dcb81703b11bc9742c182a" exitCode=2 Apr 17 15:28:13.561464 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:13.561383 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" event={"ID":"73ad47c6-e149-436f-b405-396b949ce55e","Type":"ContainerDied","Data":"8b6693011e9bd85bbf2a79b7efde338c7033d0b794dcb81703b11bc9742c182a"} Apr 17 15:28:13.561824 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:13.561807 2567 scope.go:117] "RemoveContainer" containerID="8b6693011e9bd85bbf2a79b7efde338c7033d0b794dcb81703b11bc9742c182a" Apr 17 15:28:14.567517 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:14.567488 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/1.log" Apr 17 15:28:14.567958 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:14.567947 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/0.log" Apr 17 15:28:14.568334 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:14.568279 2567 generic.go:358] "Generic (PLEG): container finished" 
podID="73ad47c6-e149-436f-b405-396b949ce55e" containerID="14aa574574c0a9f6c64b7299aa6f1a6cd5ee14125ef7c56e0959cac796560872" exitCode=2 Apr 17 15:28:14.568473 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:14.568346 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" event={"ID":"73ad47c6-e149-436f-b405-396b949ce55e","Type":"ContainerDied","Data":"14aa574574c0a9f6c64b7299aa6f1a6cd5ee14125ef7c56e0959cac796560872"} Apr 17 15:28:14.568473 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:14.568388 2567 scope.go:117] "RemoveContainer" containerID="8b6693011e9bd85bbf2a79b7efde338c7033d0b794dcb81703b11bc9742c182a" Apr 17 15:28:14.568824 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:14.568807 2567 scope.go:117] "RemoveContainer" containerID="14aa574574c0a9f6c64b7299aa6f1a6cd5ee14125ef7c56e0959cac796560872" Apr 17 15:28:14.569024 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:14.569005 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:28:15.573971 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:15.573948 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/1.log" Apr 17 15:28:16.181255 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:16.181223 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:28:16.181473 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:16.181267 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:28:16.181754 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:16.181735 2567 scope.go:117] "RemoveContainer" containerID="14aa574574c0a9f6c64b7299aa6f1a6cd5ee14125ef7c56e0959cac796560872" Apr 17 15:28:16.181996 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:16.181975 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:28:18.069735 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.069705 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j"] Apr 17 15:28:18.077424 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.077402 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.080611 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.080589 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 17 15:28:18.088162 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.088138 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j"] Apr 17 15:28:18.174862 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.174819 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/97964fda-6e6e-42d3-880f-e40d7b19cc02-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.175048 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.174914 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/97964fda-6e6e-42d3-880f-e40d7b19cc02-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.175048 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.174956 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97964fda-6e6e-42d3-880f-e40d7b19cc02-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.175048 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.174978 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/97964fda-6e6e-42d3-880f-e40d7b19cc02-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.175048 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.175001 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t22f8\" (UniqueName: \"kubernetes.io/projected/97964fda-6e6e-42d3-880f-e40d7b19cc02-kube-api-access-t22f8\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.175048 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.175041 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97964fda-6e6e-42d3-880f-e40d7b19cc02-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.275990 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.275951 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/97964fda-6e6e-42d3-880f-e40d7b19cc02-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.276161 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.275998 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97964fda-6e6e-42d3-880f-e40d7b19cc02-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") 
" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.276161 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.276016 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97964fda-6e6e-42d3-880f-e40d7b19cc02-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.276161 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.276133 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t22f8\" (UniqueName: \"kubernetes.io/projected/97964fda-6e6e-42d3-880f-e40d7b19cc02-kube-api-access-t22f8\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.276265 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.276181 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97964fda-6e6e-42d3-880f-e40d7b19cc02-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.276265 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.276250 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/97964fda-6e6e-42d3-880f-e40d7b19cc02-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.276526 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.276503 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/97964fda-6e6e-42d3-880f-e40d7b19cc02-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.276640 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.276613 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97964fda-6e6e-42d3-880f-e40d7b19cc02-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.276640 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.276635 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/97964fda-6e6e-42d3-880f-e40d7b19cc02-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.278855 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.278825 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97964fda-6e6e-42d3-880f-e40d7b19cc02-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.279052 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.279036 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97964fda-6e6e-42d3-880f-e40d7b19cc02-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.301942 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:28:18.301911 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t22f8\" (UniqueName: \"kubernetes.io/projected/97964fda-6e6e-42d3-880f-e40d7b19cc02-kube-api-access-t22f8\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-5z84j\" (UID: \"97964fda-6e6e-42d3-880f-e40d7b19cc02\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.389195 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.389170 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:18.525875 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.525843 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j"] Apr 17 15:28:18.529517 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:28:18.529487 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97964fda_6e6e_42d3_880f_e40d7b19cc02.slice/crio-997d532ffdbcc95c4056baf17296573271a2834ee682027e11c2fe02681e3137 WatchSource:0}: Error finding container 997d532ffdbcc95c4056baf17296573271a2834ee682027e11c2fe02681e3137: Status 404 returned error can't find the container with id 997d532ffdbcc95c4056baf17296573271a2834ee682027e11c2fe02681e3137 Apr 17 15:28:18.586370 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:18.586338 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" event={"ID":"97964fda-6e6e-42d3-880f-e40d7b19cc02","Type":"ContainerStarted","Data":"997d532ffdbcc95c4056baf17296573271a2834ee682027e11c2fe02681e3137"} Apr 17 15:28:19.591620 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:19.591584 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" 
event={"ID":"97964fda-6e6e-42d3-880f-e40d7b19cc02","Type":"ContainerStarted","Data":"bf858de10d7b101d31e3911784b7ecd3c2819a1b787405764894a9c92ce03319"} Apr 17 15:28:22.353081 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.353045 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl"] Apr 17 15:28:22.380135 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.380100 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl"] Apr 17 15:28:22.380297 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.380207 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.384421 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.384399 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 17 15:28:22.515950 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.515908 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf44p\" (UniqueName: \"kubernetes.io/projected/2fe3fc1b-c251-430d-8d5a-2011e0193f81-kube-api-access-gf44p\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.516146 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.515966 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe3fc1b-c251-430d-8d5a-2011e0193f81-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.516146 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.516032 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2fe3fc1b-c251-430d-8d5a-2011e0193f81-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.516146 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.516106 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2fe3fc1b-c251-430d-8d5a-2011e0193f81-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.516146 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.516133 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fe3fc1b-c251-430d-8d5a-2011e0193f81-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.516341 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.516175 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2fe3fc1b-c251-430d-8d5a-2011e0193f81-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.617356 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.617256 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2fe3fc1b-c251-430d-8d5a-2011e0193f81-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.617356 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.617330 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2fe3fc1b-c251-430d-8d5a-2011e0193f81-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.617583 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.617362 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fe3fc1b-c251-430d-8d5a-2011e0193f81-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.617583 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.617401 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2fe3fc1b-c251-430d-8d5a-2011e0193f81-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.617583 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:28:22.617472 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gf44p\" (UniqueName: \"kubernetes.io/projected/2fe3fc1b-c251-430d-8d5a-2011e0193f81-kube-api-access-gf44p\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.617583 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.617503 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe3fc1b-c251-430d-8d5a-2011e0193f81-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.617840 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.617665 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2fe3fc1b-c251-430d-8d5a-2011e0193f81-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.617840 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.617708 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fe3fc1b-c251-430d-8d5a-2011e0193f81-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.617840 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.617810 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/2fe3fc1b-c251-430d-8d5a-2011e0193f81-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.620139 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.620112 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2fe3fc1b-c251-430d-8d5a-2011e0193f81-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.620610 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.620587 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe3fc1b-c251-430d-8d5a-2011e0193f81-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.632741 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.632713 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf44p\" (UniqueName: \"kubernetes.io/projected/2fe3fc1b-c251-430d-8d5a-2011e0193f81-kube-api-access-gf44p\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl\" (UID: \"2fe3fc1b-c251-430d-8d5a-2011e0193f81\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.695922 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.695890 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:22.846285 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:22.846256 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl"] Apr 17 15:28:22.847676 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:28:22.847650 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fe3fc1b_c251_430d_8d5a_2011e0193f81.slice/crio-19f2e770f688ab28a461d3ec09401cd0f766f13b3d23506a12e4e05f88fc3ebe WatchSource:0}: Error finding container 19f2e770f688ab28a461d3ec09401cd0f766f13b3d23506a12e4e05f88fc3ebe: Status 404 returned error can't find the container with id 19f2e770f688ab28a461d3ec09401cd0f766f13b3d23506a12e4e05f88fc3ebe Apr 17 15:28:23.615714 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:23.615672 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" event={"ID":"2fe3fc1b-c251-430d-8d5a-2011e0193f81","Type":"ContainerStarted","Data":"5024b6bdc834e68466f59e39bcaf9edaca4b30d6ad624b36c034f599978cd82f"} Apr 17 15:28:23.615714 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:23.615717 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" event={"ID":"2fe3fc1b-c251-430d-8d5a-2011e0193f81","Type":"ContainerStarted","Data":"19f2e770f688ab28a461d3ec09401cd0f766f13b3d23506a12e4e05f88fc3ebe"} Apr 17 15:28:24.621797 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:24.621763 2567 generic.go:358] "Generic (PLEG): container finished" podID="97964fda-6e6e-42d3-880f-e40d7b19cc02" containerID="bf858de10d7b101d31e3911784b7ecd3c2819a1b787405764894a9c92ce03319" exitCode=0 Apr 17 15:28:24.622218 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:24.621831 2567 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" event={"ID":"97964fda-6e6e-42d3-880f-e40d7b19cc02","Type":"ContainerDied","Data":"bf858de10d7b101d31e3911784b7ecd3c2819a1b787405764894a9c92ce03319"} Apr 17 15:28:25.627931 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:25.627904 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/0.log" Apr 17 15:28:25.628359 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:25.628206 2567 generic.go:358] "Generic (PLEG): container finished" podID="97964fda-6e6e-42d3-880f-e40d7b19cc02" containerID="e563ba1796286c7e8d16fb07e6f9ea6b4de8876f73a56c29dabb12a21c09998b" exitCode=2 Apr 17 15:28:25.628359 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:25.628263 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" event={"ID":"97964fda-6e6e-42d3-880f-e40d7b19cc02","Type":"ContainerDied","Data":"e563ba1796286c7e8d16fb07e6f9ea6b4de8876f73a56c29dabb12a21c09998b"} Apr 17 15:28:25.628682 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:25.628663 2567 scope.go:117] "RemoveContainer" containerID="e563ba1796286c7e8d16fb07e6f9ea6b4de8876f73a56c29dabb12a21c09998b" Apr 17 15:28:26.634452 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:26.634424 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/1.log" Apr 17 15:28:26.634904 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:26.634887 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/0.log" Apr 17 15:28:26.635227 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:26.635207 2567 generic.go:358] "Generic (PLEG): container finished" podID="97964fda-6e6e-42d3-880f-e40d7b19cc02" 
containerID="6c720a0c34eb6960fb5feb8915db02bce7d7420d08734b6447d1016051db5a25" exitCode=2 Apr 17 15:28:26.635306 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:26.635286 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" event={"ID":"97964fda-6e6e-42d3-880f-e40d7b19cc02","Type":"ContainerDied","Data":"6c720a0c34eb6960fb5feb8915db02bce7d7420d08734b6447d1016051db5a25"} Apr 17 15:28:26.635359 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:26.635347 2567 scope.go:117] "RemoveContainer" containerID="e563ba1796286c7e8d16fb07e6f9ea6b4de8876f73a56c29dabb12a21c09998b" Apr 17 15:28:26.635859 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:26.635832 2567 scope.go:117] "RemoveContainer" containerID="6c720a0c34eb6960fb5feb8915db02bce7d7420d08734b6447d1016051db5a25" Apr 17 15:28:26.636079 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:26.636061 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:28:27.641028 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:27.640999 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/1.log" Apr 17 15:28:28.389475 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:28.389447 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:28.389475 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:28.389478 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:28.389864 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:28:28.389849 2567 scope.go:117] "RemoveContainer" containerID="6c720a0c34eb6960fb5feb8915db02bce7d7420d08734b6447d1016051db5a25" Apr 17 15:28:28.390038 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:28.390023 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:28:28.646857 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:28.646823 2567 generic.go:358] "Generic (PLEG): container finished" podID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" containerID="5024b6bdc834e68466f59e39bcaf9edaca4b30d6ad624b36c034f599978cd82f" exitCode=0 Apr 17 15:28:28.647343 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:28.646902 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" event={"ID":"2fe3fc1b-c251-430d-8d5a-2011e0193f81","Type":"ContainerDied","Data":"5024b6bdc834e68466f59e39bcaf9edaca4b30d6ad624b36c034f599978cd82f"} Apr 17 15:28:29.110231 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:29.109237 2567 scope.go:117] "RemoveContainer" containerID="14aa574574c0a9f6c64b7299aa6f1a6cd5ee14125ef7c56e0959cac796560872" Apr 17 15:28:29.652009 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:29.651986 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/0.log" Apr 17 15:28:29.652403 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:29.652383 2567 generic.go:358] "Generic (PLEG): container finished" podID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" containerID="7ca493e6f4e65f7a7da15ea00528f18035a3a1c756f72f5a62905792923ee137" exitCode=2 Apr 17 
15:28:29.652487 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:29.652467 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" event={"ID":"2fe3fc1b-c251-430d-8d5a-2011e0193f81","Type":"ContainerDied","Data":"7ca493e6f4e65f7a7da15ea00528f18035a3a1c756f72f5a62905792923ee137"} Apr 17 15:28:29.652913 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:29.652893 2567 scope.go:117] "RemoveContainer" containerID="7ca493e6f4e65f7a7da15ea00528f18035a3a1c756f72f5a62905792923ee137" Apr 17 15:28:29.654386 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:29.654364 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/2.log" Apr 17 15:28:29.654790 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:29.654766 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/1.log" Apr 17 15:28:29.655144 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:29.655124 2567 generic.go:358] "Generic (PLEG): container finished" podID="73ad47c6-e149-436f-b405-396b949ce55e" containerID="10e127c581b8630b90bb0a1c90c4b0756a5a97db1b5788ed3de211dfc756d069" exitCode=2 Apr 17 15:28:29.655240 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:29.655163 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" event={"ID":"73ad47c6-e149-436f-b405-396b949ce55e","Type":"ContainerDied","Data":"10e127c581b8630b90bb0a1c90c4b0756a5a97db1b5788ed3de211dfc756d069"} Apr 17 15:28:29.655240 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:29.655191 2567 scope.go:117] "RemoveContainer" containerID="14aa574574c0a9f6c64b7299aa6f1a6cd5ee14125ef7c56e0959cac796560872" Apr 17 15:28:29.655696 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:29.655680 2567 scope.go:117] 
"RemoveContainer" containerID="10e127c581b8630b90bb0a1c90c4b0756a5a97db1b5788ed3de211dfc756d069" Apr 17 15:28:29.655928 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:29.655900 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:28:30.660727 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:30.660700 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/1.log" Apr 17 15:28:30.661131 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:30.661053 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/0.log" Apr 17 15:28:30.661381 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:30.661363 2567 generic.go:358] "Generic (PLEG): container finished" podID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" containerID="a68c956851e27a2ea010d8be5f46410ca49c9a048057c5734777922914e28f5b" exitCode=2 Apr 17 15:28:30.661447 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:30.661428 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" event={"ID":"2fe3fc1b-c251-430d-8d5a-2011e0193f81","Type":"ContainerDied","Data":"a68c956851e27a2ea010d8be5f46410ca49c9a048057c5734777922914e28f5b"} Apr 17 15:28:30.661510 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:30.661462 2567 scope.go:117] "RemoveContainer" containerID="7ca493e6f4e65f7a7da15ea00528f18035a3a1c756f72f5a62905792923ee137" Apr 17 15:28:30.661896 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:28:30.661872 2567 scope.go:117] "RemoveContainer" containerID="a68c956851e27a2ea010d8be5f46410ca49c9a048057c5734777922914e28f5b" Apr 17 15:28:30.662160 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:30.662138 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:28:30.662915 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:30.662895 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/2.log" Apr 17 15:28:31.669243 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:31.669215 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/1.log" Apr 17 15:28:32.696872 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:32.696838 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:32.696872 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:32.696872 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:32.697272 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:32.697227 2567 scope.go:117] "RemoveContainer" containerID="a68c956851e27a2ea010d8be5f46410ca49c9a048057c5734777922914e28f5b" Apr 17 15:28:32.697452 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:32.697433 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:28:36.181530 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:36.181497 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:28:36.181530 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:36.181537 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:28:36.181978 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:36.181895 2567 scope.go:117] "RemoveContainer" containerID="10e127c581b8630b90bb0a1c90c4b0756a5a97db1b5788ed3de211dfc756d069" Apr 17 15:28:36.182077 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:36.182058 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:28:40.471552 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.471517 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h"] Apr 17 15:28:40.485891 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.485861 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.487362 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.487337 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h"] Apr 17 15:28:40.489997 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.489975 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 17 15:28:40.577845 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.577809 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/883d2618-0022-4c27-a907-543f886010a9-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.578035 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.577850 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/883d2618-0022-4c27-a907-543f886010a9-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.578035 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.577913 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/883d2618-0022-4c27-a907-543f886010a9-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.578035 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.577934 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/883d2618-0022-4c27-a907-543f886010a9-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.578166 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.578030 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5fzc\" (UniqueName: \"kubernetes.io/projected/883d2618-0022-4c27-a907-543f886010a9-kube-api-access-p5fzc\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.578166 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.578092 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/883d2618-0022-4c27-a907-543f886010a9-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.679517 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.679484 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/883d2618-0022-4c27-a907-543f886010a9-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.679517 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.679520 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/883d2618-0022-4c27-a907-543f886010a9-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" 
(UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.679764 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.679547 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5fzc\" (UniqueName: \"kubernetes.io/projected/883d2618-0022-4c27-a907-543f886010a9-kube-api-access-p5fzc\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.679764 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.679594 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/883d2618-0022-4c27-a907-543f886010a9-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.679764 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.679671 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/883d2618-0022-4c27-a907-543f886010a9-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.679764 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.679703 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/883d2618-0022-4c27-a907-543f886010a9-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.679984 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.679953 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/883d2618-0022-4c27-a907-543f886010a9-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.680031 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.679990 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/883d2618-0022-4c27-a907-543f886010a9-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.680069 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.680032 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/883d2618-0022-4c27-a907-543f886010a9-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.681806 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.681782 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/883d2618-0022-4c27-a907-543f886010a9-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.682072 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.682053 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/883d2618-0022-4c27-a907-543f886010a9-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " 
pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.691623 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.691603 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5fzc\" (UniqueName: \"kubernetes.io/projected/883d2618-0022-4c27-a907-543f886010a9-kube-api-access-p5fzc\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h\" (UID: \"883d2618-0022-4c27-a907-543f886010a9\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.796479 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.796393 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:40.938201 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:40.938178 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h"] Apr 17 15:28:40.939387 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:28:40.939360 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod883d2618_0022_4c27_a907_543f886010a9.slice/crio-c5d20edf909a9457e891bf88ec8339b5245f93b2ce18a1d8098812a7551af777 WatchSource:0}: Error finding container c5d20edf909a9457e891bf88ec8339b5245f93b2ce18a1d8098812a7551af777: Status 404 returned error can't find the container with id c5d20edf909a9457e891bf88ec8339b5245f93b2ce18a1d8098812a7551af777 Apr 17 15:28:41.712114 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:41.712076 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" event={"ID":"883d2618-0022-4c27-a907-543f886010a9","Type":"ContainerStarted","Data":"32f2e65f21e95033bb4052bf6698cdc28d01946b1ad928ec16f0231b6e395a8d"} Apr 17 15:28:41.712644 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:41.712121 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" event={"ID":"883d2618-0022-4c27-a907-543f886010a9","Type":"ContainerStarted","Data":"c5d20edf909a9457e891bf88ec8339b5245f93b2ce18a1d8098812a7551af777"} Apr 17 15:28:42.109768 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:42.109683 2567 scope.go:117] "RemoveContainer" containerID="6c720a0c34eb6960fb5feb8915db02bce7d7420d08734b6447d1016051db5a25" Apr 17 15:28:42.718067 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:42.717984 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/2.log" Apr 17 15:28:42.718502 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:42.718460 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/1.log" Apr 17 15:28:42.718797 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:42.718771 2567 generic.go:358] "Generic (PLEG): container finished" podID="97964fda-6e6e-42d3-880f-e40d7b19cc02" containerID="f1b6fc6770390af1653b94351f1054d729a0d7af64ea549b8335621f7d8006b5" exitCode=2 Apr 17 15:28:42.718867 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:42.718844 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" event={"ID":"97964fda-6e6e-42d3-880f-e40d7b19cc02","Type":"ContainerDied","Data":"f1b6fc6770390af1653b94351f1054d729a0d7af64ea549b8335621f7d8006b5"} Apr 17 15:28:42.718915 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:42.718881 2567 scope.go:117] "RemoveContainer" containerID="6c720a0c34eb6960fb5feb8915db02bce7d7420d08734b6447d1016051db5a25" Apr 17 15:28:42.719447 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:42.719429 2567 scope.go:117] "RemoveContainer" containerID="f1b6fc6770390af1653b94351f1054d729a0d7af64ea549b8335621f7d8006b5" Apr 17 15:28:42.719658 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:42.719638 2567 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:28:43.724666 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:43.724636 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/2.log" Apr 17 15:28:46.109812 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:46.109780 2567 scope.go:117] "RemoveContainer" containerID="a68c956851e27a2ea010d8be5f46410ca49c9a048057c5734777922914e28f5b" Apr 17 15:28:46.738427 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:46.738400 2567 generic.go:358] "Generic (PLEG): container finished" podID="883d2618-0022-4c27-a907-543f886010a9" containerID="32f2e65f21e95033bb4052bf6698cdc28d01946b1ad928ec16f0231b6e395a8d" exitCode=0 Apr 17 15:28:46.738608 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:46.738479 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" event={"ID":"883d2618-0022-4c27-a907-543f886010a9","Type":"ContainerDied","Data":"32f2e65f21e95033bb4052bf6698cdc28d01946b1ad928ec16f0231b6e395a8d"} Apr 17 15:28:46.739983 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:46.739960 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/2.log" Apr 17 15:28:46.740424 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:46.740358 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/1.log" Apr 17 15:28:46.740695 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:28:46.740676 2567 generic.go:358] "Generic (PLEG): container finished" podID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" containerID="89398238092b9d985a5cc7dca75919429ba9dc00918505ed7236dfa66e1ddf5c" exitCode=2 Apr 17 15:28:46.740755 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:46.740714 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" event={"ID":"2fe3fc1b-c251-430d-8d5a-2011e0193f81","Type":"ContainerDied","Data":"89398238092b9d985a5cc7dca75919429ba9dc00918505ed7236dfa66e1ddf5c"} Apr 17 15:28:46.740755 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:46.740736 2567 scope.go:117] "RemoveContainer" containerID="a68c956851e27a2ea010d8be5f46410ca49c9a048057c5734777922914e28f5b" Apr 17 15:28:46.741084 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:46.741068 2567 scope.go:117] "RemoveContainer" containerID="89398238092b9d985a5cc7dca75919429ba9dc00918505ed7236dfa66e1ddf5c" Apr 17 15:28:46.741264 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:46.741246 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:28:47.113559 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:47.113526 2567 scope.go:117] "RemoveContainer" containerID="10e127c581b8630b90bb0a1c90c4b0756a5a97db1b5788ed3de211dfc756d069" Apr 17 15:28:47.114025 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:47.113790 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main 
pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:28:47.746162 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:47.746130 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/0.log" Apr 17 15:28:47.746528 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:47.746502 2567 generic.go:358] "Generic (PLEG): container finished" podID="883d2618-0022-4c27-a907-543f886010a9" containerID="a5aeb0ca439b3a05be3191b97096da7bd0b4d4dec70320cccd430ce75106e388" exitCode=2 Apr 17 15:28:47.746616 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:47.746568 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" event={"ID":"883d2618-0022-4c27-a907-543f886010a9","Type":"ContainerDied","Data":"a5aeb0ca439b3a05be3191b97096da7bd0b4d4dec70320cccd430ce75106e388"} Apr 17 15:28:47.746971 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:47.746945 2567 scope.go:117] "RemoveContainer" containerID="a5aeb0ca439b3a05be3191b97096da7bd0b4d4dec70320cccd430ce75106e388" Apr 17 15:28:47.748169 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:47.748152 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/2.log" Apr 17 15:28:48.389944 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:48.389915 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:28:48.390344 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:48.389954 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 
15:28:48.390397 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:48.390382 2567 scope.go:117] "RemoveContainer" containerID="f1b6fc6770390af1653b94351f1054d729a0d7af64ea549b8335621f7d8006b5" Apr 17 15:28:48.390594 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:48.390571 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:28:48.760147 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:48.760069 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/1.log" Apr 17 15:28:48.760480 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:48.760466 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/0.log" Apr 17 15:28:48.760807 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:48.760785 2567 generic.go:358] "Generic (PLEG): container finished" podID="883d2618-0022-4c27-a907-543f886010a9" containerID="d51e02dcd955542c5cf39dcdc50e45baff1580121e4a615769e63123bf3f063b" exitCode=2 Apr 17 15:28:48.760881 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:48.760858 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" event={"ID":"883d2618-0022-4c27-a907-543f886010a9","Type":"ContainerDied","Data":"d51e02dcd955542c5cf39dcdc50e45baff1580121e4a615769e63123bf3f063b"} Apr 17 15:28:48.760916 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:48.760904 2567 scope.go:117] "RemoveContainer" containerID="a5aeb0ca439b3a05be3191b97096da7bd0b4d4dec70320cccd430ce75106e388" Apr 17 15:28:48.761273 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:28:48.761258 2567 scope.go:117] "RemoveContainer" containerID="d51e02dcd955542c5cf39dcdc50e45baff1580121e4a615769e63123bf3f063b" Apr 17 15:28:48.761503 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:48.761482 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:28:49.766281 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:49.766254 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/1.log" Apr 17 15:28:50.796650 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:50.796618 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:50.796650 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:50.796649 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:28:50.797171 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:50.797039 2567 scope.go:117] "RemoveContainer" containerID="d51e02dcd955542c5cf39dcdc50e45baff1580121e4a615769e63123bf3f063b" Apr 17 15:28:50.797240 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:50.797223 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:28:52.697036 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:28:52.697004 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:52.697036 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:52.697040 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:28:52.697479 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:52.697454 2567 scope.go:117] "RemoveContainer" containerID="89398238092b9d985a5cc7dca75919429ba9dc00918505ed7236dfa66e1ddf5c" Apr 17 15:28:52.697648 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:52.697630 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:28:54.054841 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.054806 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n"] Apr 17 15:28:54.058119 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.058093 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.061321 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.061288 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 17 15:28:54.069164 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.069144 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n"] Apr 17 15:28:54.202558 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.202529 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b6xz\" (UniqueName: \"kubernetes.io/projected/978ad55c-6893-4220-bff1-4ea0e2bc0a89-kube-api-access-4b6xz\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.202729 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.202601 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/978ad55c-6893-4220-bff1-4ea0e2bc0a89-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.202729 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.202653 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/978ad55c-6893-4220-bff1-4ea0e2bc0a89-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.202841 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.202753 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/978ad55c-6893-4220-bff1-4ea0e2bc0a89-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.202906 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.202859 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/978ad55c-6893-4220-bff1-4ea0e2bc0a89-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.202906 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.202893 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/978ad55c-6893-4220-bff1-4ea0e2bc0a89-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.303855 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.303806 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/978ad55c-6893-4220-bff1-4ea0e2bc0a89-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.304040 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.303867 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/978ad55c-6893-4220-bff1-4ea0e2bc0a89-kserve-provision-location\") pod 
\"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.304040 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.303915 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4b6xz\" (UniqueName: \"kubernetes.io/projected/978ad55c-6893-4220-bff1-4ea0e2bc0a89-kube-api-access-4b6xz\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.304040 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.303983 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/978ad55c-6893-4220-bff1-4ea0e2bc0a89-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.304040 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.304014 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/978ad55c-6893-4220-bff1-4ea0e2bc0a89-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.304263 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.304048 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/978ad55c-6893-4220-bff1-4ea0e2bc0a89-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.304362 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.304256 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/978ad55c-6893-4220-bff1-4ea0e2bc0a89-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.304362 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.304282 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/978ad55c-6893-4220-bff1-4ea0e2bc0a89-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.304492 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.304435 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/978ad55c-6893-4220-bff1-4ea0e2bc0a89-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.306945 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.306867 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/978ad55c-6893-4220-bff1-4ea0e2bc0a89-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.307142 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.307117 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/978ad55c-6893-4220-bff1-4ea0e2bc0a89-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.312757 
ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.312730 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b6xz\" (UniqueName: \"kubernetes.io/projected/978ad55c-6893-4220-bff1-4ea0e2bc0a89-kube-api-access-4b6xz\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-g559n\" (UID: \"978ad55c-6893-4220-bff1-4ea0e2bc0a89\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.368858 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.368827 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:28:54.497029 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.496978 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n"] Apr 17 15:28:54.499366 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:28:54.499338 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod978ad55c_6893_4220_bff1_4ea0e2bc0a89.slice/crio-84f1dd383300d2210d913c8da68eca62ee3a968f8d0704674c3aea9ea84eb07d WatchSource:0}: Error finding container 84f1dd383300d2210d913c8da68eca62ee3a968f8d0704674c3aea9ea84eb07d: Status 404 returned error can't find the container with id 84f1dd383300d2210d913c8da68eca62ee3a968f8d0704674c3aea9ea84eb07d Apr 17 15:28:54.794347 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.794295 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" event={"ID":"978ad55c-6893-4220-bff1-4ea0e2bc0a89","Type":"ContainerStarted","Data":"daa25d95aea8105ec9f2cccc570371b7d5fa7aa97093e352c56715f0a90e79ba"} Apr 17 15:28:54.794347 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:54.794348 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" 
event={"ID":"978ad55c-6893-4220-bff1-4ea0e2bc0a89","Type":"ContainerStarted","Data":"84f1dd383300d2210d913c8da68eca62ee3a968f8d0704674c3aea9ea84eb07d"} Apr 17 15:28:58.350636 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.350551 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x"] Apr 17 15:28:58.353125 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.353108 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.355598 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.355576 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 17 15:28:58.363508 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.363485 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x"] Apr 17 15:28:58.457589 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.457563 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eae4f1d7-a3b6-46de-922e-f92af4e43388-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.457752 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.457604 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eae4f1d7-a3b6-46de-922e-f92af4e43388-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.457752 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.457681 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpb52\" (UniqueName: \"kubernetes.io/projected/eae4f1d7-a3b6-46de-922e-f92af4e43388-kube-api-access-tpb52\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.457752 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.457745 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eae4f1d7-a3b6-46de-922e-f92af4e43388-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.457860 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.457781 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eae4f1d7-a3b6-46de-922e-f92af4e43388-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.457860 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.457804 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eae4f1d7-a3b6-46de-922e-f92af4e43388-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.558414 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.558300 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/eae4f1d7-a3b6-46de-922e-f92af4e43388-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.558613 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.558436 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eae4f1d7-a3b6-46de-922e-f92af4e43388-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.558613 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.558474 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eae4f1d7-a3b6-46de-922e-f92af4e43388-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.558613 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.558538 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eae4f1d7-a3b6-46de-922e-f92af4e43388-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.558613 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.558586 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eae4f1d7-a3b6-46de-922e-f92af4e43388-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " 
pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.558905 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.558639 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpb52\" (UniqueName: \"kubernetes.io/projected/eae4f1d7-a3b6-46de-922e-f92af4e43388-kube-api-access-tpb52\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.558905 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.558886 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eae4f1d7-a3b6-46de-922e-f92af4e43388-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.559038 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.558984 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eae4f1d7-a3b6-46de-922e-f92af4e43388-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.559093 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.559035 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eae4f1d7-a3b6-46de-922e-f92af4e43388-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.560755 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.560727 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/eae4f1d7-a3b6-46de-922e-f92af4e43388-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.561185 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.561168 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eae4f1d7-a3b6-46de-922e-f92af4e43388-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.566398 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.566376 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpb52\" (UniqueName: \"kubernetes.io/projected/eae4f1d7-a3b6-46de-922e-f92af4e43388-kube-api-access-tpb52\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x\" (UID: \"eae4f1d7-a3b6-46de-922e-f92af4e43388\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.665417 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.665382 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:28:58.808107 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:58.807977 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x"] Apr 17 15:28:58.812890 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:28:58.812830 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeae4f1d7_a3b6_46de_922e_f92af4e43388.slice/crio-a9bd543e724bed253257b677535b4defecc705c744731f93f32a2de45427dad4 WatchSource:0}: Error finding container a9bd543e724bed253257b677535b4defecc705c744731f93f32a2de45427dad4: Status 404 returned error can't find the container with id a9bd543e724bed253257b677535b4defecc705c744731f93f32a2de45427dad4 Apr 17 15:28:59.110538 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:59.109854 2567 scope.go:117] "RemoveContainer" containerID="f1b6fc6770390af1653b94351f1054d729a0d7af64ea549b8335621f7d8006b5" Apr 17 15:28:59.110538 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:28:59.110110 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:28:59.820609 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:59.820573 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" event={"ID":"eae4f1d7-a3b6-46de-922e-f92af4e43388","Type":"ContainerStarted","Data":"9a445da647bdfb34cb618ecfedb8ba0b3fba81fb35659b534f13fed192960ace"} Apr 17 15:28:59.820609 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:28:59.820609 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" event={"ID":"eae4f1d7-a3b6-46de-922e-f92af4e43388","Type":"ContainerStarted","Data":"a9bd543e724bed253257b677535b4defecc705c744731f93f32a2de45427dad4"} Apr 17 15:29:00.827094 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:00.827057 2567 generic.go:358] "Generic (PLEG): container finished" podID="978ad55c-6893-4220-bff1-4ea0e2bc0a89" containerID="daa25d95aea8105ec9f2cccc570371b7d5fa7aa97093e352c56715f0a90e79ba" exitCode=0 Apr 17 15:29:00.827498 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:00.827132 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" event={"ID":"978ad55c-6893-4220-bff1-4ea0e2bc0a89","Type":"ContainerDied","Data":"daa25d95aea8105ec9f2cccc570371b7d5fa7aa97093e352c56715f0a90e79ba"} Apr 17 15:29:02.109645 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:02.109610 2567 scope.go:117] "RemoveContainer" containerID="10e127c581b8630b90bb0a1c90c4b0756a5a97db1b5788ed3de211dfc756d069" Apr 17 15:29:03.110893 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:03.109944 2567 scope.go:117] "RemoveContainer" containerID="d51e02dcd955542c5cf39dcdc50e45baff1580121e4a615769e63123bf3f063b" Apr 17 15:29:05.858013 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.857972 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" event={"ID":"978ad55c-6893-4220-bff1-4ea0e2bc0a89","Type":"ContainerStarted","Data":"01fdf83f80ae071baa9acc8e046ae5fdb2d1ac29782a1dce178738cf52585db4"} Apr 17 15:29:05.858460 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.858211 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:29:05.859349 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.859334 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/2.log" Apr 17 15:29:05.859695 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.859681 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/1.log" Apr 17 15:29:05.859998 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.859981 2567 generic.go:358] "Generic (PLEG): container finished" podID="883d2618-0022-4c27-a907-543f886010a9" containerID="ac52a4f75e09bd8a2e9009c76af4d17cc900a6b8622f30feeefd592c561359d2" exitCode=2 Apr 17 15:29:05.860067 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.860049 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" event={"ID":"883d2618-0022-4c27-a907-543f886010a9","Type":"ContainerDied","Data":"ac52a4f75e09bd8a2e9009c76af4d17cc900a6b8622f30feeefd592c561359d2"} Apr 17 15:29:05.860120 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.860082 2567 scope.go:117] "RemoveContainer" containerID="d51e02dcd955542c5cf39dcdc50e45baff1580121e4a615769e63123bf3f063b" Apr 17 15:29:05.860515 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.860485 2567 scope.go:117] "RemoveContainer" containerID="ac52a4f75e09bd8a2e9009c76af4d17cc900a6b8622f30feeefd592c561359d2" Apr 17 15:29:05.860745 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:05.860723 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:29:05.861592 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.861576 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/3.log" Apr 17 15:29:05.861970 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.861944 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/2.log" Apr 17 15:29:05.862327 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.862286 2567 generic.go:358] "Generic (PLEG): container finished" podID="73ad47c6-e149-436f-b405-396b949ce55e" containerID="bab89ad268b1de121794f3b6d0263226e2629b69664aa1f8f22a946535ff6421" exitCode=2 Apr 17 15:29:05.862420 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.862348 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" event={"ID":"73ad47c6-e149-436f-b405-396b949ce55e","Type":"ContainerDied","Data":"bab89ad268b1de121794f3b6d0263226e2629b69664aa1f8f22a946535ff6421"} Apr 17 15:29:05.862745 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.862682 2567 scope.go:117] "RemoveContainer" containerID="bab89ad268b1de121794f3b6d0263226e2629b69664aa1f8f22a946535ff6421" Apr 17 15:29:05.862926 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:05.862906 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:29:05.863677 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.863658 2567 generic.go:358] "Generic (PLEG): container finished" podID="eae4f1d7-a3b6-46de-922e-f92af4e43388" containerID="9a445da647bdfb34cb618ecfedb8ba0b3fba81fb35659b534f13fed192960ace" exitCode=0 Apr 17 15:29:05.863768 
ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.863686 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" event={"ID":"eae4f1d7-a3b6-46de-922e-f92af4e43388","Type":"ContainerDied","Data":"9a445da647bdfb34cb618ecfedb8ba0b3fba81fb35659b534f13fed192960ace"} Apr 17 15:29:05.871869 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.871851 2567 scope.go:117] "RemoveContainer" containerID="10e127c581b8630b90bb0a1c90c4b0756a5a97db1b5788ed3de211dfc756d069" Apr 17 15:29:05.884066 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:05.884027 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" podStartSLOduration=7.66821859 podStartE2EDuration="11.884016674s" podCreationTimestamp="2026-04-17 15:28:54 +0000 UTC" firstStartedPulling="2026-04-17 15:29:00.827803729 +0000 UTC m=+714.219345742" lastFinishedPulling="2026-04-17 15:29:05.043601814 +0000 UTC m=+718.435143826" observedRunningTime="2026-04-17 15:29:05.881569916 +0000 UTC m=+719.273111951" watchObservedRunningTime="2026-04-17 15:29:05.884016674 +0000 UTC m=+719.275558709" Apr 17 15:29:06.181773 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:06.181741 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:29:06.181861 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:06.181781 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:29:06.868851 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:06.868826 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/3.log" Apr 17 15:29:06.869637 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:06.869614 2567 scope.go:117] 
"RemoveContainer" containerID="bab89ad268b1de121794f3b6d0263226e2629b69664aa1f8f22a946535ff6421" Apr 17 15:29:06.869854 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:06.869832 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:29:06.870670 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:06.870653 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/0.log" Apr 17 15:29:06.870989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:06.870970 2567 generic.go:358] "Generic (PLEG): container finished" podID="eae4f1d7-a3b6-46de-922e-f92af4e43388" containerID="e0b0ed05a666c1cff2d48ed1bc498a8878a2a53918849af0821c82744828f9dd" exitCode=2 Apr 17 15:29:06.871056 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:06.871039 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" event={"ID":"eae4f1d7-a3b6-46de-922e-f92af4e43388","Type":"ContainerDied","Data":"e0b0ed05a666c1cff2d48ed1bc498a8878a2a53918849af0821c82744828f9dd"} Apr 17 15:29:06.871377 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:06.871360 2567 scope.go:117] "RemoveContainer" containerID="e0b0ed05a666c1cff2d48ed1bc498a8878a2a53918849af0821c82744828f9dd" Apr 17 15:29:06.872474 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:06.872457 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/2.log" Apr 17 15:29:07.877853 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:07.877823 2567 log.go:25] "Finished 
parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/1.log" Apr 17 15:29:07.878295 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:07.878217 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/0.log" Apr 17 15:29:07.878564 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:07.878545 2567 generic.go:358] "Generic (PLEG): container finished" podID="eae4f1d7-a3b6-46de-922e-f92af4e43388" containerID="33880298c3c057a1c4eae612ec1dadcd8bdae84f973cbb38972d64993751a445" exitCode=2 Apr 17 15:29:07.878635 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:07.878615 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" event={"ID":"eae4f1d7-a3b6-46de-922e-f92af4e43388","Type":"ContainerDied","Data":"33880298c3c057a1c4eae612ec1dadcd8bdae84f973cbb38972d64993751a445"} Apr 17 15:29:07.878678 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:07.878658 2567 scope.go:117] "RemoveContainer" containerID="e0b0ed05a666c1cff2d48ed1bc498a8878a2a53918849af0821c82744828f9dd" Apr 17 15:29:07.879128 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:07.879110 2567 scope.go:117] "RemoveContainer" containerID="33880298c3c057a1c4eae612ec1dadcd8bdae84f973cbb38972d64993751a445" Apr 17 15:29:07.879369 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:07.879350 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:29:08.109751 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:08.109716 2567 scope.go:117] "RemoveContainer" 
containerID="89398238092b9d985a5cc7dca75919429ba9dc00918505ed7236dfa66e1ddf5c" Apr 17 15:29:08.665746 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:08.665655 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:29:08.665746 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:08.665690 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:29:08.884815 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:08.884787 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/3.log" Apr 17 15:29:08.885286 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:08.885180 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/2.log" Apr 17 15:29:08.885544 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:08.885522 2567 generic.go:358] "Generic (PLEG): container finished" podID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" containerID="7d669d4048040fcb9a63f125ac6b056173329e14458c3b6fcf4771d3d9636778" exitCode=2 Apr 17 15:29:08.885621 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:08.885599 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" event={"ID":"2fe3fc1b-c251-430d-8d5a-2011e0193f81","Type":"ContainerDied","Data":"7d669d4048040fcb9a63f125ac6b056173329e14458c3b6fcf4771d3d9636778"} Apr 17 15:29:08.885680 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:08.885651 2567 scope.go:117] "RemoveContainer" containerID="89398238092b9d985a5cc7dca75919429ba9dc00918505ed7236dfa66e1ddf5c" Apr 17 15:29:08.886056 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:08.886036 2567 scope.go:117] "RemoveContainer" 
containerID="7d669d4048040fcb9a63f125ac6b056173329e14458c3b6fcf4771d3d9636778" Apr 17 15:29:08.886341 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:08.886294 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:29:08.887081 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:08.887065 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/1.log" Apr 17 15:29:08.887673 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:08.887658 2567 scope.go:117] "RemoveContainer" containerID="33880298c3c057a1c4eae612ec1dadcd8bdae84f973cbb38972d64993751a445" Apr 17 15:29:08.887817 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:08.887802 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:29:09.893607 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:09.893580 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/3.log" Apr 17 15:29:10.797350 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:10.797284 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:29:10.797524 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:29:10.797366 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:29:10.797777 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:10.797759 2567 scope.go:117] "RemoveContainer" containerID="ac52a4f75e09bd8a2e9009c76af4d17cc900a6b8622f30feeefd592c561359d2" Apr 17 15:29:10.797969 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:10.797952 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:29:12.696916 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:12.696881 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:29:12.696916 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:12.696917 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:29:12.697387 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:12.697370 2567 scope.go:117] "RemoveContainer" containerID="7d669d4048040fcb9a63f125ac6b056173329e14458c3b6fcf4771d3d9636778" Apr 17 15:29:12.697584 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:12.697566 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 
15:29:13.110041 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:13.109955 2567 scope.go:117] "RemoveContainer" containerID="f1b6fc6770390af1653b94351f1054d729a0d7af64ea549b8335621f7d8006b5" Apr 17 15:29:13.912946 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:13.912920 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/3.log" Apr 17 15:29:13.913443 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:13.913398 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/2.log" Apr 17 15:29:13.913768 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:13.913744 2567 generic.go:358] "Generic (PLEG): container finished" podID="97964fda-6e6e-42d3-880f-e40d7b19cc02" containerID="02366c6278e3d014688300bc66244f06595a941aa44db4a27e1692fc8eeae360" exitCode=2 Apr 17 15:29:13.913829 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:13.913813 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" event={"ID":"97964fda-6e6e-42d3-880f-e40d7b19cc02","Type":"ContainerDied","Data":"02366c6278e3d014688300bc66244f06595a941aa44db4a27e1692fc8eeae360"} Apr 17 15:29:13.913884 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:13.913854 2567 scope.go:117] "RemoveContainer" containerID="f1b6fc6770390af1653b94351f1054d729a0d7af64ea549b8335621f7d8006b5" Apr 17 15:29:13.914364 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:13.914339 2567 scope.go:117] "RemoveContainer" containerID="02366c6278e3d014688300bc66244f06595a941aa44db4a27e1692fc8eeae360" Apr 17 15:29:13.914599 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:13.914578 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main 
pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:29:14.919154 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:14.919131 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/3.log" Apr 17 15:29:16.885496 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:16.885466 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-g559n" Apr 17 15:29:17.114506 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:17.114475 2567 scope.go:117] "RemoveContainer" containerID="bab89ad268b1de121794f3b6d0263226e2629b69664aa1f8f22a946535ff6421" Apr 17 15:29:17.114696 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:17.114679 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:29:18.389599 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:18.389558 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:29:18.389599 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:18.389592 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:29:18.390084 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:18.390042 2567 scope.go:117] "RemoveContainer" containerID="02366c6278e3d014688300bc66244f06595a941aa44db4a27e1692fc8eeae360" Apr 17 15:29:18.390259 ip-10-0-130-92 
kubenswrapper[2567]: E0417 15:29:18.390241 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:29:22.109541 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:22.109507 2567 scope.go:117] "RemoveContainer" containerID="33880298c3c057a1c4eae612ec1dadcd8bdae84f973cbb38972d64993751a445" Apr 17 15:29:22.954462 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:22.954436 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/2.log" Apr 17 15:29:22.954798 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:22.954782 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/1.log" Apr 17 15:29:22.955057 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:22.955037 2567 generic.go:358] "Generic (PLEG): container finished" podID="eae4f1d7-a3b6-46de-922e-f92af4e43388" containerID="645eeacdde21c7f7d72283c82dbd0fd5e33d696acc378ca80202a7ff7288b8bb" exitCode=2 Apr 17 15:29:22.955123 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:22.955107 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" event={"ID":"eae4f1d7-a3b6-46de-922e-f92af4e43388","Type":"ContainerDied","Data":"645eeacdde21c7f7d72283c82dbd0fd5e33d696acc378ca80202a7ff7288b8bb"} Apr 17 15:29:22.955163 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:22.955148 2567 scope.go:117] "RemoveContainer" containerID="33880298c3c057a1c4eae612ec1dadcd8bdae84f973cbb38972d64993751a445" Apr 17 15:29:22.955576 ip-10-0-130-92 kubenswrapper[2567]: I0417 
15:29:22.955558 2567 scope.go:117] "RemoveContainer" containerID="645eeacdde21c7f7d72283c82dbd0fd5e33d696acc378ca80202a7ff7288b8bb" Apr 17 15:29:22.955779 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:22.955762 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:29:23.960891 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:23.960865 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/2.log" Apr 17 15:29:24.109857 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:24.109823 2567 scope.go:117] "RemoveContainer" containerID="7d669d4048040fcb9a63f125ac6b056173329e14458c3b6fcf4771d3d9636778" Apr 17 15:29:24.110054 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:24.109883 2567 scope.go:117] "RemoveContainer" containerID="ac52a4f75e09bd8a2e9009c76af4d17cc900a6b8622f30feeefd592c561359d2" Apr 17 15:29:24.110119 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:24.110054 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:29:24.110119 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:24.110092 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main 
pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:29:28.666377 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:28.666345 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:29:28.666867 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:28.666441 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:29:28.666867 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:28.666844 2567 scope.go:117] "RemoveContainer" containerID="645eeacdde21c7f7d72283c82dbd0fd5e33d696acc378ca80202a7ff7288b8bb" Apr 17 15:29:28.667059 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:28.667041 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:29:28.982336 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:28.982248 2567 scope.go:117] "RemoveContainer" containerID="645eeacdde21c7f7d72283c82dbd0fd5e33d696acc378ca80202a7ff7288b8bb" Apr 17 15:29:28.982492 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:28.982456 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:29:29.109523 
ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:29.109485 2567 scope.go:117] "RemoveContainer" containerID="bab89ad268b1de121794f3b6d0263226e2629b69664aa1f8f22a946535ff6421" Apr 17 15:29:29.109731 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:29.109709 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:29:31.109979 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:31.109939 2567 scope.go:117] "RemoveContainer" containerID="02366c6278e3d014688300bc66244f06595a941aa44db4a27e1692fc8eeae360" Apr 17 15:29:31.110511 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:31.110189 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:29:35.109423 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:35.109387 2567 scope.go:117] "RemoveContainer" containerID="ac52a4f75e09bd8a2e9009c76af4d17cc900a6b8622f30feeefd592c561359d2" Apr 17 15:29:36.012599 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:36.012519 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/3.log" Apr 17 15:29:36.012910 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:36.012893 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/2.log" Apr 
17 15:29:36.013232 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:36.013201 2567 generic.go:358] "Generic (PLEG): container finished" podID="883d2618-0022-4c27-a907-543f886010a9" containerID="2b7e082637596f7499a6f2ed5059a82833911dca5b203a758bb82c76ce63e2b6" exitCode=2 Apr 17 15:29:36.013351 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:36.013276 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" event={"ID":"883d2618-0022-4c27-a907-543f886010a9","Type":"ContainerDied","Data":"2b7e082637596f7499a6f2ed5059a82833911dca5b203a758bb82c76ce63e2b6"} Apr 17 15:29:36.013351 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:36.013343 2567 scope.go:117] "RemoveContainer" containerID="ac52a4f75e09bd8a2e9009c76af4d17cc900a6b8622f30feeefd592c561359d2" Apr 17 15:29:36.013850 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:36.013829 2567 scope.go:117] "RemoveContainer" containerID="2b7e082637596f7499a6f2ed5059a82833911dca5b203a758bb82c76ce63e2b6" Apr 17 15:29:36.014053 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:36.014033 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:29:37.018695 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:37.018664 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/3.log" Apr 17 15:29:39.109414 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:39.109387 2567 scope.go:117] "RemoveContainer" containerID="7d669d4048040fcb9a63f125ac6b056173329e14458c3b6fcf4771d3d9636778" Apr 17 15:29:39.109848 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:39.109563 2567 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:29:40.796488 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:40.796455 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:29:40.796488 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:40.796496 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:29:40.796970 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:40.796940 2567 scope.go:117] "RemoveContainer" containerID="2b7e082637596f7499a6f2ed5059a82833911dca5b203a758bb82c76ce63e2b6" Apr 17 15:29:40.797154 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:40.797134 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:29:41.109072 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:41.108996 2567 scope.go:117] "RemoveContainer" containerID="bab89ad268b1de121794f3b6d0263226e2629b69664aa1f8f22a946535ff6421" Apr 17 15:29:41.109072 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:41.109041 2567 scope.go:117] "RemoveContainer" containerID="645eeacdde21c7f7d72283c82dbd0fd5e33d696acc378ca80202a7ff7288b8bb" Apr 17 15:29:41.109249 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:41.109212 2567 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:29:41.109249 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:41.109210 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:29:45.109576 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:45.109540 2567 scope.go:117] "RemoveContainer" containerID="02366c6278e3d014688300bc66244f06595a941aa44db4a27e1692fc8eeae360" Apr 17 15:29:45.109993 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:45.109795 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:29:52.109354 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:52.109302 2567 scope.go:117] "RemoveContainer" containerID="7d669d4048040fcb9a63f125ac6b056173329e14458c3b6fcf4771d3d9636778" Apr 17 15:29:53.088690 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:53.088662 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/4.log" Apr 17 15:29:53.089024 ip-10-0-130-92 kubenswrapper[2567]: 
I0417 15:29:53.089009 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/3.log" Apr 17 15:29:53.089363 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:53.089339 2567 generic.go:358] "Generic (PLEG): container finished" podID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" containerID="5f7a671475afb5be5532ed0bf884884b50a5dbc06dcfaeff1070c943b4eb8ff9" exitCode=2 Apr 17 15:29:53.089429 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:53.089369 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" event={"ID":"2fe3fc1b-c251-430d-8d5a-2011e0193f81","Type":"ContainerDied","Data":"5f7a671475afb5be5532ed0bf884884b50a5dbc06dcfaeff1070c943b4eb8ff9"} Apr 17 15:29:53.089429 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:53.089418 2567 scope.go:117] "RemoveContainer" containerID="7d669d4048040fcb9a63f125ac6b056173329e14458c3b6fcf4771d3d9636778" Apr 17 15:29:53.089849 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:53.089827 2567 scope.go:117] "RemoveContainer" containerID="5f7a671475afb5be5532ed0bf884884b50a5dbc06dcfaeff1070c943b4eb8ff9" Apr 17 15:29:53.090088 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:53.090069 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:29:54.094865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:54.094838 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/4.log" Apr 17 
15:29:54.109771 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:54.109754 2567 scope.go:117] "RemoveContainer" containerID="2b7e082637596f7499a6f2ed5059a82833911dca5b203a758bb82c76ce63e2b6" Apr 17 15:29:54.109959 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:54.109942 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:29:55.109951 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:55.109918 2567 scope.go:117] "RemoveContainer" containerID="645eeacdde21c7f7d72283c82dbd0fd5e33d696acc378ca80202a7ff7288b8bb" Apr 17 15:29:55.110360 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:55.110022 2567 scope.go:117] "RemoveContainer" containerID="bab89ad268b1de121794f3b6d0263226e2629b69664aa1f8f22a946535ff6421" Apr 17 15:29:56.105263 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.105232 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/4.log" Apr 17 15:29:56.105618 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.105601 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/3.log" Apr 17 15:29:56.105979 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.105952 2567 generic.go:358] "Generic (PLEG): container finished" podID="73ad47c6-e149-436f-b405-396b949ce55e" containerID="b0b1b4342eb0e44e5d9afe321a3eb8ab347350b96ff76e52876981d7e28463c3" exitCode=2 Apr 17 15:29:56.106096 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.106028 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" event={"ID":"73ad47c6-e149-436f-b405-396b949ce55e","Type":"ContainerDied","Data":"b0b1b4342eb0e44e5d9afe321a3eb8ab347350b96ff76e52876981d7e28463c3"}
Apr 17 15:29:56.106096 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.106077 2567 scope.go:117] "RemoveContainer" containerID="bab89ad268b1de121794f3b6d0263226e2629b69664aa1f8f22a946535ff6421"
Apr 17 15:29:56.106516 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.106494 2567 scope.go:117] "RemoveContainer" containerID="b0b1b4342eb0e44e5d9afe321a3eb8ab347350b96ff76e52876981d7e28463c3"
Apr 17 15:29:56.106745 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:56.106724 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e"
Apr 17 15:29:56.107643 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.107622 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/3.log"
Apr 17 15:29:56.108001 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.107988 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/2.log"
Apr 17 15:29:56.108322 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.108294 2567 generic.go:358] "Generic (PLEG): container finished" podID="eae4f1d7-a3b6-46de-922e-f92af4e43388" containerID="51dabb1f3908d6b4e4e8350d070bce804b032fd944c2dbdcab5ec3bf01e66c2a" exitCode=2
Apr 17 15:29:56.108394 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.108344 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" event={"ID":"eae4f1d7-a3b6-46de-922e-f92af4e43388","Type":"ContainerDied","Data":"51dabb1f3908d6b4e4e8350d070bce804b032fd944c2dbdcab5ec3bf01e66c2a"}
Apr 17 15:29:56.108675 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.108663 2567 scope.go:117] "RemoveContainer" containerID="51dabb1f3908d6b4e4e8350d070bce804b032fd944c2dbdcab5ec3bf01e66c2a"
Apr 17 15:29:56.108823 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:56.108809 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388"
Apr 17 15:29:56.118498 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.118301 2567 scope.go:117] "RemoveContainer" containerID="645eeacdde21c7f7d72283c82dbd0fd5e33d696acc378ca80202a7ff7288b8bb"
Apr 17 15:29:56.181804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.181778 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr"
Apr 17 15:29:56.181804 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:56.181810 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr"
Apr 17 15:29:57.113980 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:57.113948 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/4.log"
Apr 17 15:29:57.114722 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:57.114699 2567 scope.go:117] "RemoveContainer" containerID="b0b1b4342eb0e44e5d9afe321a3eb8ab347350b96ff76e52876981d7e28463c3"
Apr 17 15:29:57.114969 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:57.114947 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e"
Apr 17 15:29:57.115864 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:57.115847 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/3.log"
Apr 17 15:29:58.666395 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:58.666352 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x"
Apr 17 15:29:58.666395 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:58.666399 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x"
Apr 17 15:29:58.666865 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:58.666849 2567 scope.go:117] "RemoveContainer" containerID="51dabb1f3908d6b4e4e8350d070bce804b032fd944c2dbdcab5ec3bf01e66c2a"
Apr 17 15:29:58.667087 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:29:58.667069 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388"
Apr 17 15:29:59.108923 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:29:59.108846 2567 scope.go:117] "RemoveContainer" containerID="02366c6278e3d014688300bc66244f06595a941aa44db4a27e1692fc8eeae360"
Apr 17 15:30:00.136860 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:00.136834 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/4.log"
Apr 17 15:30:00.137434 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:00.137230 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/3.log"
Apr 17 15:30:00.137579 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:00.137560 2567 generic.go:358] "Generic (PLEG): container finished" podID="97964fda-6e6e-42d3-880f-e40d7b19cc02" containerID="b2795d2ad99311323d61fcf17884284ce0d094a0198123d6e12ec353a4077e72" exitCode=2
Apr 17 15:30:00.137657 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:00.137635 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" event={"ID":"97964fda-6e6e-42d3-880f-e40d7b19cc02","Type":"ContainerDied","Data":"b2795d2ad99311323d61fcf17884284ce0d094a0198123d6e12ec353a4077e72"}
Apr 17 15:30:00.137734 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:00.137690 2567 scope.go:117] "RemoveContainer" containerID="02366c6278e3d014688300bc66244f06595a941aa44db4a27e1692fc8eeae360"
Apr 17 15:30:00.138101 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:00.138080 2567 scope.go:117] "RemoveContainer" containerID="b2795d2ad99311323d61fcf17884284ce0d094a0198123d6e12ec353a4077e72"
Apr 17 15:30:00.138376 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:00.138353 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02"
Apr 17 15:30:01.142802 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:01.142768 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/4.log"
Apr 17 15:30:02.696775 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:02.696745 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl"
Apr 17 15:30:02.696775 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:02.696773 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl"
Apr 17 15:30:02.697205 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:02.697158 2567 scope.go:117] "RemoveContainer" containerID="5f7a671475afb5be5532ed0bf884884b50a5dbc06dcfaeff1070c943b4eb8ff9"
Apr 17 15:30:02.697404 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:02.697382 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81"
Apr 17 15:30:08.108934 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:08.108902 2567 scope.go:117] "RemoveContainer" containerID="2b7e082637596f7499a6f2ed5059a82833911dca5b203a758bb82c76ce63e2b6"
Apr 17 15:30:08.109364 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:08.109072 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9"
Apr 17 15:30:08.390096 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:08.390065 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j"
Apr 17 15:30:08.390096 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:08.390101 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j"
Apr 17 15:30:08.390531 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:08.390517 2567 scope.go:117] "RemoveContainer" containerID="b2795d2ad99311323d61fcf17884284ce0d094a0198123d6e12ec353a4077e72"
Apr 17 15:30:08.390744 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:08.390727 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02"
Apr 17 15:30:10.109850 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:10.109820 2567 scope.go:117] "RemoveContainer" containerID="b0b1b4342eb0e44e5d9afe321a3eb8ab347350b96ff76e52876981d7e28463c3"
Apr 17 15:30:10.110232 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:10.109968 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e"
Apr 17 15:30:11.109233 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:11.109203 2567 scope.go:117] "RemoveContainer" containerID="51dabb1f3908d6b4e4e8350d070bce804b032fd944c2dbdcab5ec3bf01e66c2a"
Apr 17 15:30:11.109444 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:11.109426 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388"
Apr 17 15:30:18.108974 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:18.108936 2567 scope.go:117] "RemoveContainer" containerID="5f7a671475afb5be5532ed0bf884884b50a5dbc06dcfaeff1070c943b4eb8ff9"
Apr 17 15:30:18.109479 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:18.109193 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81"
Apr 17 15:30:19.109213 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:19.109178 2567 scope.go:117] "RemoveContainer" containerID="b2795d2ad99311323d61fcf17884284ce0d094a0198123d6e12ec353a4077e72"
Apr 17 15:30:19.109687 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:19.109467 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02"
Apr 17 15:30:23.109784 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:23.109751 2567 scope.go:117] "RemoveContainer" containerID="b0b1b4342eb0e44e5d9afe321a3eb8ab347350b96ff76e52876981d7e28463c3"
Apr 17 15:30:23.110187 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:23.109963 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e"
Apr 17 15:30:23.110187 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:23.110005 2567 scope.go:117] "RemoveContainer" containerID="2b7e082637596f7499a6f2ed5059a82833911dca5b203a758bb82c76ce63e2b6"
Apr 17 15:30:24.238211 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:24.238177 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/4.log"
Apr 17 15:30:24.238750 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:24.238732 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/3.log"
Apr 17 15:30:24.239165 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:24.239120 2567 generic.go:358] "Generic (PLEG): container finished" podID="883d2618-0022-4c27-a907-543f886010a9" containerID="2316ecd0d332f27c529dd61757806592ddd47af429cee72c8befc1cc7393869e" exitCode=2
Apr 17 15:30:24.239529 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:24.239503 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" event={"ID":"883d2618-0022-4c27-a907-543f886010a9","Type":"ContainerDied","Data":"2316ecd0d332f27c529dd61757806592ddd47af429cee72c8befc1cc7393869e"}
Apr 17 15:30:24.239686 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:24.239662 2567 scope.go:117] "RemoveContainer" containerID="2b7e082637596f7499a6f2ed5059a82833911dca5b203a758bb82c76ce63e2b6"
Apr 17 15:30:24.240469 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:24.240453 2567 scope.go:117] "RemoveContainer" containerID="2316ecd0d332f27c529dd61757806592ddd47af429cee72c8befc1cc7393869e"
Apr 17 15:30:24.241297 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:24.241273 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9"
Apr 17 15:30:25.245830 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:25.245803 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/4.log"
Apr 17 15:30:26.108933 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:26.108860 2567 scope.go:117] "RemoveContainer" containerID="51dabb1f3908d6b4e4e8350d070bce804b032fd944c2dbdcab5ec3bf01e66c2a"
Apr 17 15:30:26.109097 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:26.109078 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388"
Apr 17 15:30:29.114274 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:29.114246 2567 scope.go:117] "RemoveContainer" containerID="5f7a671475afb5be5532ed0bf884884b50a5dbc06dcfaeff1070c943b4eb8ff9"
Apr 17 15:30:29.114675 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:29.114438 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81"
Apr 17 15:30:30.797409 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:30.797370 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h"
Apr 17 15:30:30.797409 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:30.797412 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h"
Apr 17 15:30:30.797840 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:30.797827 2567 scope.go:117] "RemoveContainer" containerID="2316ecd0d332f27c529dd61757806592ddd47af429cee72c8befc1cc7393869e"
Apr 17 15:30:30.798044 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:30.798028 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9"
Apr 17 15:30:34.109027 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:34.108993 2567 scope.go:117] "RemoveContainer" containerID="b2795d2ad99311323d61fcf17884284ce0d094a0198123d6e12ec353a4077e72"
Apr 17 15:30:34.109501 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:34.109196 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02"
Apr 17 15:30:37.113758 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:37.113728 2567 scope.go:117] "RemoveContainer" containerID="b0b1b4342eb0e44e5d9afe321a3eb8ab347350b96ff76e52876981d7e28463c3"
Apr 17 15:30:37.114172 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:37.113835 2567 scope.go:117] "RemoveContainer" containerID="51dabb1f3908d6b4e4e8350d070bce804b032fd944c2dbdcab5ec3bf01e66c2a"
Apr 17 15:30:37.114172 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:37.113924 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e"
Apr 17 15:30:38.309844 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:38.309817 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/4.log"
Apr 17 15:30:38.310274 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:38.310183 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/3.log"
Apr 17 15:30:38.310528 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:38.310506 2567 generic.go:358] "Generic (PLEG): container finished" podID="eae4f1d7-a3b6-46de-922e-f92af4e43388" containerID="5d74a09569a087015847f099616f50845bd4904af383982b4fe057c3c5d8931f" exitCode=2
Apr 17 15:30:38.310609 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:38.310585 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" event={"ID":"eae4f1d7-a3b6-46de-922e-f92af4e43388","Type":"ContainerDied","Data":"5d74a09569a087015847f099616f50845bd4904af383982b4fe057c3c5d8931f"}
Apr 17 15:30:38.310671 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:38.310638 2567 scope.go:117] "RemoveContainer" containerID="51dabb1f3908d6b4e4e8350d070bce804b032fd944c2dbdcab5ec3bf01e66c2a"
Apr 17 15:30:38.311093 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:38.311069 2567 scope.go:117] "RemoveContainer" containerID="5d74a09569a087015847f099616f50845bd4904af383982b4fe057c3c5d8931f"
Apr 17 15:30:38.311348 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:38.311329 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388"
Apr 17 15:30:38.665876 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:38.665845 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x"
Apr 17 15:30:38.666039 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:38.665888 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x"
Apr 17 15:30:39.316082 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:39.316056 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/4.log"
Apr 17 15:30:39.316829 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:39.316797 2567 scope.go:117] "RemoveContainer" containerID="5d74a09569a087015847f099616f50845bd4904af383982b4fe057c3c5d8931f"
Apr 17 15:30:39.317032 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:39.317013 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388"
Apr 17 15:30:41.109435 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:41.109395 2567 scope.go:117] "RemoveContainer" containerID="5f7a671475afb5be5532ed0bf884884b50a5dbc06dcfaeff1070c943b4eb8ff9"
Apr 17 15:30:41.109915 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:41.109657 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81"
Apr 17 15:30:46.108902 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:46.108871 2567 scope.go:117] "RemoveContainer" containerID="2316ecd0d332f27c529dd61757806592ddd47af429cee72c8befc1cc7393869e"
Apr 17 15:30:46.109414 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:46.109091 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9"
Apr 17 15:30:47.112899 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:47.112869 2567 scope.go:117] "RemoveContainer" containerID="b2795d2ad99311323d61fcf17884284ce0d094a0198123d6e12ec353a4077e72"
Apr 17 15:30:47.113418 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:47.113121 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02"
Apr 17 15:30:51.108952 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:51.108913 2567 scope.go:117] "RemoveContainer" containerID="b0b1b4342eb0e44e5d9afe321a3eb8ab347350b96ff76e52876981d7e28463c3"
Apr 17 15:30:51.109455 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:51.109135 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e"
Apr 17 15:30:52.109381 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:52.109342 2567 scope.go:117] "RemoveContainer" containerID="5d74a09569a087015847f099616f50845bd4904af383982b4fe057c3c5d8931f"
Apr 17 15:30:52.109770 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:52.109524 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388"
Apr 17 15:30:55.109579 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:55.109543 2567 scope.go:117] "RemoveContainer" containerID="5f7a671475afb5be5532ed0bf884884b50a5dbc06dcfaeff1070c943b4eb8ff9"
Apr 17 15:30:55.109967 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:55.109752 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81"
Apr 17 15:30:58.108974 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:30:58.108945 2567 scope.go:117] "RemoveContainer" containerID="2316ecd0d332f27c529dd61757806592ddd47af429cee72c8befc1cc7393869e"
Apr 17 15:30:58.109369 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:30:58.109137 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9"
Apr 17 15:31:00.109136 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:00.109098 2567 scope.go:117] "RemoveContainer" containerID="b2795d2ad99311323d61fcf17884284ce0d094a0198123d6e12ec353a4077e72"
Apr 17 15:31:00.109624 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:00.109373 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02"
Apr 17 15:31:04.109648 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:04.109616 2567 scope.go:117] "RemoveContainer" containerID="5d74a09569a087015847f099616f50845bd4904af383982b4fe057c3c5d8931f"
Apr 17 15:31:04.110020 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:04.109832 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388"
Apr 17 15:31:05.109474 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:05.109440 2567 scope.go:117] "RemoveContainer" containerID="b0b1b4342eb0e44e5d9afe321a3eb8ab347350b96ff76e52876981d7e28463c3"
Apr 17 15:31:05.109710 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:05.109686 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e"
Apr 17 15:31:09.109176 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:09.109144 2567 scope.go:117] "RemoveContainer" containerID="5f7a671475afb5be5532ed0bf884884b50a5dbc06dcfaeff1070c943b4eb8ff9"
Apr 17 15:31:09.109693 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:09.109368 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81"
Apr 17 15:31:13.109882 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:13.109851 2567 scope.go:117] "RemoveContainer" containerID="2316ecd0d332f27c529dd61757806592ddd47af429cee72c8befc1cc7393869e"
Apr 17 15:31:13.110283 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:13.110044 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9"
Apr 17 15:31:15.109653 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:15.109624 2567 scope.go:117] "RemoveContainer" containerID="b2795d2ad99311323d61fcf17884284ce0d094a0198123d6e12ec353a4077e72"
Apr 17 15:31:15.110030 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:15.109780 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02"
Apr 17 15:31:19.109837 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:19.109802 2567 scope.go:117] "RemoveContainer" containerID="b0b1b4342eb0e44e5d9afe321a3eb8ab347350b96ff76e52876981d7e28463c3"
Apr 17 15:31:19.110286 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:19.109940 2567 scope.go:117] "RemoveContainer" containerID="5d74a09569a087015847f099616f50845bd4904af383982b4fe057c3c5d8931f"
Apr 17 15:31:19.110286 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:19.110137 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388"
Apr 17 15:31:19.487010 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:19.486990 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/5.log"
Apr 17 15:31:19.487416 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:19.487403 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/4.log"
Apr 17 15:31:19.487719 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:19.487699 2567 generic.go:358] "Generic (PLEG): container finished" podID="73ad47c6-e149-436f-b405-396b949ce55e" containerID="c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63" exitCode=2
Apr 17 15:31:19.487771 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:19.487753 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" event={"ID":"73ad47c6-e149-436f-b405-396b949ce55e","Type":"ContainerDied","Data":"c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63"}
Apr 17 15:31:19.487807 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:19.487783 2567 scope.go:117] "RemoveContainer" containerID="b0b1b4342eb0e44e5d9afe321a3eb8ab347350b96ff76e52876981d7e28463c3"
Apr 17 15:31:19.488166 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:19.488147 2567 scope.go:117] "RemoveContainer" containerID="c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63"
Apr 17 15:31:19.488427 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:19.488409 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e"
Apr 17 15:31:20.493030 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:20.493000 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/5.log"
Apr 17 15:31:22.109638 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:22.109604 2567 scope.go:117] "RemoveContainer" containerID="5f7a671475afb5be5532ed0bf884884b50a5dbc06dcfaeff1070c943b4eb8ff9"
Apr 17 15:31:22.502654 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:22.502626 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/5.log"
Apr 17 15:31:22.503090 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:22.503074 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/4.log"
Apr 17 15:31:22.503414 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:22.503386 2567 generic.go:358] "Generic (PLEG): container finished" podID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b" exitCode=2
Apr 17 15:31:22.503486 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:22.503451 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" event={"ID":"2fe3fc1b-c251-430d-8d5a-2011e0193f81","Type":"ContainerDied","Data":"5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b"}
Apr 17 15:31:22.503530 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:22.503499 2567 scope.go:117] "RemoveContainer" containerID="5f7a671475afb5be5532ed0bf884884b50a5dbc06dcfaeff1070c943b4eb8ff9"
Apr 17 15:31:22.503944 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:22.503922 2567 scope.go:117] "RemoveContainer" containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b"
Apr 17 15:31:22.504159 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:22.504143 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:31:22.696946 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:22.696919 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:31:22.696946 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:22.696948 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" Apr 17 15:31:23.518958 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:23.518930 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/5.log" Apr 17 15:31:23.519687 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:23.519668 2567 scope.go:117] "RemoveContainer" containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b" Apr 17 15:31:23.519876 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:23.519857 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:31:25.109412 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:25.109382 2567 
scope.go:117] "RemoveContainer" containerID="2316ecd0d332f27c529dd61757806592ddd47af429cee72c8befc1cc7393869e" Apr 17 15:31:25.109809 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:25.109581 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:31:26.181939 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:26.181909 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:31:26.181939 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:26.181943 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" Apr 17 15:31:26.182390 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:26.182285 2567 scope.go:117] "RemoveContainer" containerID="c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63" Apr 17 15:31:26.182490 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:26.182473 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:31:29.108936 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:29.108904 2567 scope.go:117] "RemoveContainer" containerID="b2795d2ad99311323d61fcf17884284ce0d094a0198123d6e12ec353a4077e72" Apr 17 15:31:29.545618 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:29.545593 2567 
log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/5.log" Apr 17 15:31:29.545993 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:29.545979 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/4.log" Apr 17 15:31:29.546271 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:29.546252 2567 generic.go:358] "Generic (PLEG): container finished" podID="97964fda-6e6e-42d3-880f-e40d7b19cc02" containerID="573291715f13d35799ac8e8857fecdbc8626c8f55be1951ac36ca5f81f64e4f5" exitCode=2 Apr 17 15:31:29.546370 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:29.546348 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" event={"ID":"97964fda-6e6e-42d3-880f-e40d7b19cc02","Type":"ContainerDied","Data":"573291715f13d35799ac8e8857fecdbc8626c8f55be1951ac36ca5f81f64e4f5"} Apr 17 15:31:29.546432 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:29.546392 2567 scope.go:117] "RemoveContainer" containerID="b2795d2ad99311323d61fcf17884284ce0d094a0198123d6e12ec353a4077e72" Apr 17 15:31:29.546777 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:29.546759 2567 scope.go:117] "RemoveContainer" containerID="573291715f13d35799ac8e8857fecdbc8626c8f55be1951ac36ca5f81f64e4f5" Apr 17 15:31:29.546993 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:29.546975 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:31:30.551486 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:30.551459 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/5.log" Apr 17 15:31:34.109795 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:34.109764 2567 scope.go:117] "RemoveContainer" containerID="5d74a09569a087015847f099616f50845bd4904af383982b4fe057c3c5d8931f" Apr 17 15:31:34.110166 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:34.109946 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:31:35.109386 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:35.109351 2567 scope.go:117] "RemoveContainer" containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b" Apr 17 15:31:35.109614 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:35.109589 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:31:38.109355 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:38.109303 2567 scope.go:117] "RemoveContainer" containerID="c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63" Apr 17 15:31:38.109733 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:38.109509 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main 
pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:31:38.390223 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:38.390192 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:31:38.390223 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:38.390228 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" Apr 17 15:31:38.390709 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:38.390689 2567 scope.go:117] "RemoveContainer" containerID="573291715f13d35799ac8e8857fecdbc8626c8f55be1951ac36ca5f81f64e4f5" Apr 17 15:31:38.390943 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:38.390922 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:31:39.109125 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:39.109090 2567 scope.go:117] "RemoveContainer" containerID="2316ecd0d332f27c529dd61757806592ddd47af429cee72c8befc1cc7393869e" Apr 17 15:31:39.109356 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:39.109298 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:31:46.109255 
ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:46.109225 2567 scope.go:117] "RemoveContainer" containerID="5d74a09569a087015847f099616f50845bd4904af383982b4fe057c3c5d8931f" Apr 17 15:31:46.109769 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:46.109375 2567 scope.go:117] "RemoveContainer" containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b" Apr 17 15:31:46.109769 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:46.109426 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:31:46.109769 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:46.109551 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:31:50.109229 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:50.109194 2567 scope.go:117] "RemoveContainer" containerID="c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63" Apr 17 15:31:50.109646 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:50.109448 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" 
podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:31:51.109217 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:51.109186 2567 scope.go:117] "RemoveContainer" containerID="2316ecd0d332f27c529dd61757806592ddd47af429cee72c8befc1cc7393869e" Apr 17 15:31:51.646363 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:51.646334 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/5.log" Apr 17 15:31:51.646834 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:51.646774 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/4.log" Apr 17 15:31:51.647117 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:51.647094 2567 generic.go:358] "Generic (PLEG): container finished" podID="883d2618-0022-4c27-a907-543f886010a9" containerID="a0e63a8f728b1bf7f77d17d10cd1e32a81b11efff8ccb6904aec61f74186d781" exitCode=2 Apr 17 15:31:51.647186 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:51.647167 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" event={"ID":"883d2618-0022-4c27-a907-543f886010a9","Type":"ContainerDied","Data":"a0e63a8f728b1bf7f77d17d10cd1e32a81b11efff8ccb6904aec61f74186d781"} Apr 17 15:31:51.647223 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:51.647210 2567 scope.go:117] "RemoveContainer" containerID="2316ecd0d332f27c529dd61757806592ddd47af429cee72c8befc1cc7393869e" Apr 17 15:31:51.647687 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:51.647636 2567 scope.go:117] "RemoveContainer" containerID="a0e63a8f728b1bf7f77d17d10cd1e32a81b11efff8ccb6904aec61f74186d781" Apr 17 15:31:51.647917 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:51.647900 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed 
container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:31:52.109676 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:52.109595 2567 scope.go:117] "RemoveContainer" containerID="573291715f13d35799ac8e8857fecdbc8626c8f55be1951ac36ca5f81f64e4f5" Apr 17 15:31:52.109829 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:52.109794 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:31:52.652834 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:52.652805 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/5.log" Apr 17 15:31:59.109722 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:31:59.109643 2567 scope.go:117] "RemoveContainer" containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b" Apr 17 15:31:59.110095 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:31:59.109834 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:32:00.108997 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:00.108966 2567 scope.go:117] "RemoveContainer" 
containerID="5d74a09569a087015847f099616f50845bd4904af383982b4fe057c3c5d8931f" Apr 17 15:32:00.685877 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:00.685846 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/5.log" Apr 17 15:32:00.686289 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:00.686188 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/4.log" Apr 17 15:32:00.686538 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:00.686516 2567 generic.go:358] "Generic (PLEG): container finished" podID="eae4f1d7-a3b6-46de-922e-f92af4e43388" containerID="a49d71cd99208f238b863456d4294f8ac933e93b2cfd8e9f917add664cf1849f" exitCode=2 Apr 17 15:32:00.686616 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:00.686593 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" event={"ID":"eae4f1d7-a3b6-46de-922e-f92af4e43388","Type":"ContainerDied","Data":"a49d71cd99208f238b863456d4294f8ac933e93b2cfd8e9f917add664cf1849f"} Apr 17 15:32:00.686655 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:00.686645 2567 scope.go:117] "RemoveContainer" containerID="5d74a09569a087015847f099616f50845bd4904af383982b4fe057c3c5d8931f" Apr 17 15:32:00.687076 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:00.687058 2567 scope.go:117] "RemoveContainer" containerID="a49d71cd99208f238b863456d4294f8ac933e93b2cfd8e9f917add664cf1849f" Apr 17 15:32:00.687323 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:00.687290 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" 
pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:32:00.797007 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:00.796978 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:32:00.797007 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:00.797010 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" Apr 17 15:32:00.797478 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:00.797461 2567 scope.go:117] "RemoveContainer" containerID="a0e63a8f728b1bf7f77d17d10cd1e32a81b11efff8ccb6904aec61f74186d781" Apr 17 15:32:00.797661 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:00.797645 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:32:01.692494 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:01.692466 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/5.log" Apr 17 15:32:05.109721 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:05.109692 2567 scope.go:117] "RemoveContainer" containerID="573291715f13d35799ac8e8857fecdbc8626c8f55be1951ac36ca5f81f64e4f5" Apr 17 15:32:05.110088 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:05.109893 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" 
pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:32:05.110088 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:05.109928 2567 scope.go:117] "RemoveContainer" containerID="c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63" Apr 17 15:32:05.110166 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:05.110147 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:32:08.665740 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:08.665705 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:32:08.665740 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:08.665744 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" Apr 17 15:32:08.666265 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:08.666237 2567 scope.go:117] "RemoveContainer" containerID="a49d71cd99208f238b863456d4294f8ac933e93b2cfd8e9f917add664cf1849f" Apr 17 15:32:08.666577 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:08.666554 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:32:09.289964 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:09.289932 2567 log.go:25] "Finished 
parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/5.log" Apr 17 15:32:09.291116 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:09.291093 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/5.log" Apr 17 15:32:09.291247 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:09.291096 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/5.log" Apr 17 15:32:09.291756 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:09.291738 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/5.log" Apr 17 15:32:09.292182 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:09.292159 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/5.log" Apr 17 15:32:09.292294 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:09.292280 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/5.log" Apr 17 15:32:09.292768 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:09.292751 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/5.log" Apr 17 15:32:09.292863 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:09.292850 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/5.log" Apr 17 15:32:09.293280 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:09.293267 2567 
log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/5.log" Apr 17 15:32:09.293887 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:09.293868 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/5.log" Apr 17 15:32:10.109785 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:10.109755 2567 scope.go:117] "RemoveContainer" containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b" Apr 17 15:32:10.110166 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:10.109918 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:32:13.109393 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:13.109361 2567 scope.go:117] "RemoveContainer" containerID="a0e63a8f728b1bf7f77d17d10cd1e32a81b11efff8ccb6904aec61f74186d781" Apr 17 15:32:13.109776 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:13.109579 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:32:16.109883 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:16.109850 2567 scope.go:117] "RemoveContainer" containerID="573291715f13d35799ac8e8857fecdbc8626c8f55be1951ac36ca5f81f64e4f5" Apr 17 15:32:16.110296 ip-10-0-130-92 
kubenswrapper[2567]: E0417 15:32:16.110070 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:32:20.109834 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:20.109803 2567 scope.go:117] "RemoveContainer" containerID="c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63" Apr 17 15:32:20.110223 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:20.109929 2567 scope.go:117] "RemoveContainer" containerID="a49d71cd99208f238b863456d4294f8ac933e93b2cfd8e9f917add664cf1849f" Apr 17 15:32:20.110223 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:20.110029 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:32:20.110223 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:20.110081 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:32:24.109102 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:24.109069 2567 scope.go:117] "RemoveContainer" containerID="a0e63a8f728b1bf7f77d17d10cd1e32a81b11efff8ccb6904aec61f74186d781" Apr 17 15:32:24.109599 ip-10-0-130-92 
kubenswrapper[2567]: I0417 15:32:24.109133 2567 scope.go:117] "RemoveContainer" containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b" Apr 17 15:32:24.109599 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:24.109273 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:32:24.109599 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:24.109301 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:32:27.112893 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:27.112862 2567 scope.go:117] "RemoveContainer" containerID="573291715f13d35799ac8e8857fecdbc8626c8f55be1951ac36ca5f81f64e4f5" Apr 17 15:32:27.113302 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:27.113061 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:32:32.108836 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:32.108799 2567 scope.go:117] "RemoveContainer" containerID="c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63" Apr 17 15:32:32.111194 ip-10-0-130-92 
kubenswrapper[2567]: E0417 15:32:32.108999 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:32:33.109407 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:33.109377 2567 scope.go:117] "RemoveContainer" containerID="a49d71cd99208f238b863456d4294f8ac933e93b2cfd8e9f917add664cf1849f" Apr 17 15:32:33.109798 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:33.109579 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:32:39.109457 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:39.109428 2567 scope.go:117] "RemoveContainer" containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b" Apr 17 15:32:39.109860 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:39.109581 2567 scope.go:117] "RemoveContainer" containerID="a0e63a8f728b1bf7f77d17d10cd1e32a81b11efff8ccb6904aec61f74186d781" Apr 17 15:32:39.109860 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:39.109633 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 
15:32:39.109860 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:39.109742 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:32:40.109734 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:40.109695 2567 scope.go:117] "RemoveContainer" containerID="573291715f13d35799ac8e8857fecdbc8626c8f55be1951ac36ca5f81f64e4f5" Apr 17 15:32:40.110159 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:40.109872 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:32:44.109038 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:44.109007 2567 scope.go:117] "RemoveContainer" containerID="a49d71cd99208f238b863456d4294f8ac933e93b2cfd8e9f917add664cf1849f" Apr 17 15:32:44.109471 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:44.109200 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:32:45.109485 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:45.109454 2567 scope.go:117] "RemoveContainer" containerID="c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63" Apr 17 15:32:45.109967 
ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:45.109664 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:32:48.316354 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:48.316324 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/5.log" Apr 17 15:32:48.456669 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:48.456642 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/5.log" Apr 17 15:32:48.587162 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:48.587083 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/storage-initializer/0.log" Apr 17 15:32:52.109090 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:52.109055 2567 scope.go:117] "RemoveContainer" containerID="a0e63a8f728b1bf7f77d17d10cd1e32a81b11efff8ccb6904aec61f74186d781" Apr 17 15:32:52.109502 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:52.109226 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:32:53.109690 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:53.109656 2567 scope.go:117] "RemoveContainer" 
containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b" Apr 17 15:32:53.110067 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:53.109861 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:32:53.597961 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:53.597931 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-c66kx_8cb081aa-6157-47cb-8014-89e70208a3d0/manager/0.log" Apr 17 15:32:53.721337 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:53.721277 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-698ccc456c-xjxql_08fa5197-d6b6-497f-bc14-9c60e27747f8/maas-api/0.log" Apr 17 15:32:53.852627 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:53.852550 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-796df7bcdb-64phj_c2e5acae-41c9-4108-8435-f58254d5ce79/manager/0.log" Apr 17 15:32:53.975940 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:53.975911 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-bzprl_6cd03577-1ebb-4eff-8aff-97e5177167f0/manager/1.log" Apr 17 15:32:54.091581 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:54.091556 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9bd7bdf77-49nqx_27b103cb-80f5-4d9f-9bc7-a1812dddb90b/manager/0.log" Apr 17 15:32:54.109444 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:54.109372 2567 scope.go:117] "RemoveContainer" 
containerID="573291715f13d35799ac8e8857fecdbc8626c8f55be1951ac36ca5f81f64e4f5" Apr 17 15:32:54.109631 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:54.109610 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:32:54.478670 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:54.478637 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-wl7nz_8ab53a7e-e165-4e21-abe7-b7b332046d0a/postgres/0.log" Apr 17 15:32:55.289989 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.289957 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk_a99383d9-65b6-4ee3-ab0d-6c0ec759718c/extract/0.log" Apr 17 15:32:55.295658 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.295632 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk_a99383d9-65b6-4ee3-ab0d-6c0ec759718c/util/0.log" Apr 17 15:32:55.301513 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.301490 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk_a99383d9-65b6-4ee3-ab0d-6c0ec759718c/pull/0.log" Apr 17 15:32:55.415076 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.415049 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4_37a35ca9-255c-451b-9bad-8be8bf2a870e/extract/0.log" Apr 17 15:32:55.421034 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.421009 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4_37a35ca9-255c-451b-9bad-8be8bf2a870e/util/0.log" Apr 17 15:32:55.427719 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.427694 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4_37a35ca9-255c-451b-9bad-8be8bf2a870e/pull/0.log" Apr 17 15:32:55.534278 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.534251 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64_cccdfbc3-a737-4bd7-8b48-d6c109a8e987/pull/0.log" Apr 17 15:32:55.539450 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.539433 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64_cccdfbc3-a737-4bd7-8b48-d6c109a8e987/extract/0.log" Apr 17 15:32:55.546478 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.546431 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64_cccdfbc3-a737-4bd7-8b48-d6c109a8e987/util/0.log" Apr 17 15:32:55.665740 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.665714 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98_3cf6f2c5-ccf3-4836-b98e-e30127cf91f8/util/0.log" Apr 17 15:32:55.671752 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.671733 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98_3cf6f2c5-ccf3-4836-b98e-e30127cf91f8/pull/0.log" Apr 17 15:32:55.677236 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.677220 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98_3cf6f2c5-ccf3-4836-b98e-e30127cf91f8/extract/0.log" Apr 17 15:32:55.918564 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:55.918535 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-ts8sf_761f932c-6cb0-40a7-a949-ab44709b017d/manager/0.log" Apr 17 15:32:56.109710 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:56.109680 2567 scope.go:117] "RemoveContainer" containerID="c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63" Apr 17 15:32:56.109892 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:56.109869 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:32:57.068159 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:57.068130 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-zcfm7_31b7a54e-f948-41b1-8ab5-1edb1c1f74e0/discovery/0.log" Apr 17 15:32:57.112991 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:57.112962 2567 scope.go:117] "RemoveContainer" containerID="a49d71cd99208f238b863456d4294f8ac933e93b2cfd8e9f917add664cf1849f" Apr 17 15:32:57.113157 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:32:57.113129 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" 
podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:32:57.296499 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:57.296469 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-546dd5d8dc-bcsjj_709f8d01-1d92-4090-a398-2530cfd1ed0e/kube-auth-proxy/0.log" Apr 17 15:32:57.534126 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:57.534094 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-58ccc558bb-xngk4_080f2dec-182b-40e8-adf6-95cf8c5342c7/router/0.log" Apr 17 15:32:57.904847 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:57.904821 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/storage-initializer/0.log" Apr 17 15:32:57.910881 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:57.910863 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_883d2618-0022-4c27-a907-543f886010a9/main/5.log" Apr 17 15:32:58.021551 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:58.021526 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/storage-initializer/0.log" Apr 17 15:32:58.027809 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:58.027791 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-5z84j_97964fda-6e6e-42d3-880f-e40d7b19cc02/main/5.log" Apr 17 15:32:58.139682 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:58.139659 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-g559n_978ad55c-6893-4220-bff1-4ea0e2bc0a89/storage-initializer/0.log" Apr 17 15:32:58.146275 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:58.146253 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-g559n_978ad55c-6893-4220-bff1-4ea0e2bc0a89/main/0.log" Apr 17 15:32:58.255527 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:58.255449 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/storage-initializer/0.log" Apr 17 15:32:58.261507 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:58.261466 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_2fe3fc1b-c251-430d-8d5a-2011e0193f81/main/5.log" Apr 17 15:32:58.387457 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:58.387430 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/storage-initializer/0.log" Apr 17 15:32:58.394348 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:58.394325 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_eae4f1d7-a3b6-46de-922e-f92af4e43388/main/5.log" Apr 17 15:32:58.521634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:58.521561 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/storage-initializer/0.log" Apr 17 15:32:58.527435 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:32:58.527413 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_73ad47c6-e149-436f-b405-396b949ce55e/main/5.log" Apr 17 15:33:05.109814 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:05.109778 2567 scope.go:117] "RemoveContainer" containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b" Apr 17 15:33:05.110220 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:33:05.109977 2567 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81" Apr 17 15:33:05.742159 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:05.742128 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-z8tnm_ea6be3e1-87f0-4c68-b704-4a21dbd76850/global-pull-secret-syncer/0.log" Apr 17 15:33:05.841193 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:05.841164 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mxqss_eff92ac3-7f84-4934-b020-eda543896879/konnectivity-agent/0.log" Apr 17 15:33:05.864682 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:05.864657 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-92.ec2.internal_8c8041c1c70e4c5187623822d69c931b/haproxy/0.log" Apr 17 15:33:06.109686 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:06.109600 2567 scope.go:117] "RemoveContainer" containerID="a0e63a8f728b1bf7f77d17d10cd1e32a81b11efff8ccb6904aec61f74186d781" Apr 17 15:33:06.109864 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:06.109774 2567 scope.go:117] "RemoveContainer" containerID="573291715f13d35799ac8e8857fecdbc8626c8f55be1951ac36ca5f81f64e4f5" Apr 17 15:33:06.109864 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:33:06.109792 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" 
podUID="883d2618-0022-4c27-a907-543f886010a9" Apr 17 15:33:06.110223 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:33:06.109980 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02" Apr 17 15:33:09.113647 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.113611 2567 scope.go:117] "RemoveContainer" containerID="c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63" Apr 17 15:33:09.114128 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.113782 2567 scope.go:117] "RemoveContainer" containerID="a49d71cd99208f238b863456d4294f8ac933e93b2cfd8e9f917add664cf1849f" Apr 17 15:33:09.114128 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:33:09.113821 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e" Apr 17 15:33:09.114128 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:33:09.113997 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388" Apr 17 15:33:09.688033 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.688006 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk_a99383d9-65b6-4ee3-ab0d-6c0ec759718c/extract/0.log" Apr 17 15:33:09.708463 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.708438 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk_a99383d9-65b6-4ee3-ab0d-6c0ec759718c/util/0.log" Apr 17 15:33:09.729147 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.729122 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759q9lbk_a99383d9-65b6-4ee3-ab0d-6c0ec759718c/pull/0.log" Apr 17 15:33:09.758919 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.758894 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4_37a35ca9-255c-451b-9bad-8be8bf2a870e/extract/0.log" Apr 17 15:33:09.783932 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.783896 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4_37a35ca9-255c-451b-9bad-8be8bf2a870e/util/0.log" Apr 17 15:33:09.804725 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.804704 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rnrv4_37a35ca9-255c-451b-9bad-8be8bf2a870e/pull/0.log" Apr 17 15:33:09.833403 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.833377 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64_cccdfbc3-a737-4bd7-8b48-d6c109a8e987/extract/0.log" Apr 17 15:33:09.859269 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.859249 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64_cccdfbc3-a737-4bd7-8b48-d6c109a8e987/util/0.log" Apr 17 15:33:09.883574 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.883552 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736vg64_cccdfbc3-a737-4bd7-8b48-d6c109a8e987/pull/0.log" Apr 17 15:33:09.907674 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.907652 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98_3cf6f2c5-ccf3-4836-b98e-e30127cf91f8/extract/0.log" Apr 17 15:33:09.931430 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.931413 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98_3cf6f2c5-ccf3-4836-b98e-e30127cf91f8/util/0.log" Apr 17 15:33:09.968812 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:09.968761 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pgk98_3cf6f2c5-ccf3-4836-b98e-e30127cf91f8/pull/0.log" Apr 17 15:33:10.020419 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:10.020396 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-ts8sf_761f932c-6cb0-40a7-a949-ab44709b017d/manager/0.log" Apr 17 15:33:11.641224 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:11.641194 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6f5c3978-c304-4ce4-a24f-e298635f0b6c/alertmanager/0.log" Apr 17 15:33:11.664556 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:11.664494 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6f5c3978-c304-4ce4-a24f-e298635f0b6c/config-reloader/0.log" Apr 17 
15:33:11.687218 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:11.687202 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6f5c3978-c304-4ce4-a24f-e298635f0b6c/kube-rbac-proxy-web/0.log" Apr 17 15:33:11.711446 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:11.711413 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6f5c3978-c304-4ce4-a24f-e298635f0b6c/kube-rbac-proxy/0.log" Apr 17 15:33:11.738408 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:11.738389 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6f5c3978-c304-4ce4-a24f-e298635f0b6c/kube-rbac-proxy-metric/0.log" Apr 17 15:33:11.763362 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:11.763344 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6f5c3978-c304-4ce4-a24f-e298635f0b6c/prom-label-proxy/0.log" Apr 17 15:33:11.785322 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:11.785299 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6f5c3978-c304-4ce4-a24f-e298635f0b6c/init-config-reloader/0.log" Apr 17 15:33:11.817169 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:11.817147 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-n8pd7_4261f15f-644e-4914-8e45-1bfa8a2447d7/cluster-monitoring-operator/0.log" Apr 17 15:33:11.986019 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:11.985944 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7vvjb_522b9c9d-a847-4b8a-971b-6f6ac840eae0/node-exporter/0.log" Apr 17 15:33:12.009331 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:12.009299 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-7vvjb_522b9c9d-a847-4b8a-971b-6f6ac840eae0/kube-rbac-proxy/0.log" Apr 17 15:33:12.036302 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:12.036279 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7vvjb_522b9c9d-a847-4b8a-971b-6f6ac840eae0/init-textfile/0.log" Apr 17 15:33:12.214138 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:12.214098 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gct64_ff092072-7608-4ec7-8227-561dd01cd73c/kube-rbac-proxy-main/0.log" Apr 17 15:33:12.235959 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:12.235930 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gct64_ff092072-7608-4ec7-8227-561dd01cd73c/kube-rbac-proxy-self/0.log" Apr 17 15:33:12.255361 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:12.255280 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gct64_ff092072-7608-4ec7-8227-561dd01cd73c/openshift-state-metrics/0.log" Apr 17 15:33:13.886491 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:13.886449 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-lmx8c_cf60d958-a515-4c65-8fd2-9bd9d19fa3ab/networking-console-plugin/0.log" Apr 17 15:33:14.300345 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.300249 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"] Apr 17 15:33:14.303929 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.303911 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm" Apr 17 15:33:14.306199 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.306174 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-79652\"/\"openshift-service-ca.crt\"" Apr 17 15:33:14.307361 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.307343 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-79652\"/\"kube-root-ca.crt\"" Apr 17 15:33:14.307452 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.307343 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-79652\"/\"default-dockercfg-zpffh\"" Apr 17 15:33:14.312848 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.312824 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"] Apr 17 15:33:14.457635 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.457600 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-sys\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm" Apr 17 15:33:14.457635 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.457637 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-lib-modules\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm" Apr 17 15:33:14.457850 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.457709 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-zhdzn\" (UniqueName: \"kubernetes.io/projected/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-kube-api-access-zhdzn\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.457850 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.457730 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-podres\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.457850 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.457829 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-proc\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.559244 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.559167 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-sys\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.559244 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.559210 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-lib-modules\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.559461 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.559248 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhdzn\" (UniqueName: \"kubernetes.io/projected/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-kube-api-access-zhdzn\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.559461 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.559268 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-podres\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.559461 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.559282 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-sys\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.559461 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.559345 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-proc\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.559461 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.559411 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-proc\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.559461 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.559426 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-lib-modules\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.559660 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.559464 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-podres\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.567487 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.567465 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhdzn\" (UniqueName: \"kubernetes.io/projected/450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4-kube-api-access-zhdzn\") pod \"perf-node-gather-daemonset-ln2bm\" (UID: \"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.615397 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.615371 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:14.742143 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.742112 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"]
Apr 17 15:33:14.743101 ip-10-0-130-92 kubenswrapper[2567]: W0417 15:33:14.743068 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod450f4fd5_cf86_49fb_9d93_97fc3cd4e3d4.slice/crio-2a03034135afd6ce900a42c6b4cdd4ff2dee55260fb9a871fb405a597a95456e WatchSource:0}: Error finding container 2a03034135afd6ce900a42c6b4cdd4ff2dee55260fb9a871fb405a597a95456e: Status 404 returned error can't find the container with id 2a03034135afd6ce900a42c6b4cdd4ff2dee55260fb9a871fb405a597a95456e
Apr 17 15:33:14.744766 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:14.744750 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 15:33:15.019063 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:15.019028 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm" event={"ID":"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4","Type":"ContainerStarted","Data":"13bee53f8371cb2cfd08207dbf8496fbccc723327bd6413b0dce18a88631bd80"}
Apr 17 15:33:15.019063 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:15.019064 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm" event={"ID":"450f4fd5-cf86-49fb-9d93-97fc3cd4e3d4","Type":"ContainerStarted","Data":"2a03034135afd6ce900a42c6b4cdd4ff2dee55260fb9a871fb405a597a95456e"}
Apr 17 15:33:15.019525 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:15.019148 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:15.035791 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:15.035742 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm" podStartSLOduration=1.035728617 podStartE2EDuration="1.035728617s" podCreationTimestamp="2026-04-17 15:33:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:33:15.03333104 +0000 UTC m=+968.424873066" watchObservedRunningTime="2026-04-17 15:33:15.035728617 +0000 UTC m=+968.427270651"
Apr 17 15:33:15.398477 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:15.398445 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-kx2v7_4dc26639-dd19-46cc-97ba-c69e0b27c74e/volume-data-source-validator/0.log"
Apr 17 15:33:16.109607 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:16.109583 2567 scope.go:117] "RemoveContainer" containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b"
Apr 17 15:33:16.109977 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:33:16.109773 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81"
Apr 17 15:33:16.217786 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:16.217762 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-78c86_b2ad6199-9c63-412c-b433-b95e9dec556b/dns/0.log"
Apr 17 15:33:16.239131 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:16.239111 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-78c86_b2ad6199-9c63-412c-b433-b95e9dec556b/kube-rbac-proxy/0.log"
Apr 17 15:33:16.354294 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:16.354265 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r2ffn_a2a09a76-6aeb-4520-9efb-287cddc7f75b/dns-node-resolver/0.log"
Apr 17 15:33:16.809964 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:16.809936 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4n2f7_1f42d472-7136-4d33-b081-4e8ae758480e/node-ca/0.log"
Apr 17 15:33:17.757784 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:17.757752 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-zcfm7_31b7a54e-f948-41b1-8ab5-1edb1c1f74e0/discovery/0.log"
Apr 17 15:33:17.806589 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:17.806564 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-546dd5d8dc-bcsjj_709f8d01-1d92-4090-a398-2530cfd1ed0e/kube-auth-proxy/0.log"
Apr 17 15:33:17.859024 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:17.858995 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-58ccc558bb-xngk4_080f2dec-182b-40e8-adf6-95cf8c5342c7/router/0.log"
Apr 17 15:33:18.354165 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:18.354138 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jpn8m_eeb34dd9-f023-4a23-8830-151d5b605625/serve-healthcheck-canary/0.log"
Apr 17 15:33:18.751577 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:18.751546 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-wvmmd_4270e3fe-b069-4a89-bd6d-10514be6fb65/insights-operator/0.log"
Apr 17 15:33:18.752765 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:18.752748 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-wvmmd_4270e3fe-b069-4a89-bd6d-10514be6fb65/insights-operator/1.log"
Apr 17 15:33:18.832223 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:18.832183 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gs7lp_737f8ec5-4bba-47e1-95e8-4699679c5200/kube-rbac-proxy/0.log"
Apr 17 15:33:18.850963 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:18.850942 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gs7lp_737f8ec5-4bba-47e1-95e8-4699679c5200/exporter/0.log"
Apr 17 15:33:18.869955 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:18.869935 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gs7lp_737f8ec5-4bba-47e1-95e8-4699679c5200/extractor/0.log"
Apr 17 15:33:20.109397 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:20.109367 2567 scope.go:117] "RemoveContainer" containerID="a0e63a8f728b1bf7f77d17d10cd1e32a81b11efff8ccb6904aec61f74186d781"
Apr 17 15:33:20.109782 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:20.109472 2567 scope.go:117] "RemoveContainer" containerID="a49d71cd99208f238b863456d4294f8ac933e93b2cfd8e9f917add664cf1849f"
Apr 17 15:33:20.109782 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:33:20.109577 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9"
Apr 17 15:33:20.109782 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:33:20.109632 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x_llm(eae4f1d7-a3b6-46de-922e-f92af4e43388)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-p8f5x" podUID="eae4f1d7-a3b6-46de-922e-f92af4e43388"
Apr 17 15:33:20.720226 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:20.720192 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-c66kx_8cb081aa-6157-47cb-8014-89e70208a3d0/manager/0.log"
Apr 17 15:33:20.740592 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:20.740566 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-698ccc456c-xjxql_08fa5197-d6b6-497f-bc14-9c60e27747f8/maas-api/0.log"
Apr 17 15:33:20.764447 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:20.764423 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-796df7bcdb-64phj_c2e5acae-41c9-4108-8435-f58254d5ce79/manager/0.log"
Apr 17 15:33:20.787321 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:20.787295 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-bzprl_6cd03577-1ebb-4eff-8aff-97e5177167f0/manager/0.log"
Apr 17 15:33:20.797040 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:20.797012 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-bzprl_6cd03577-1ebb-4eff-8aff-97e5177167f0/manager/1.log"
Apr 17 15:33:20.816326 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:20.816289 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9bd7bdf77-49nqx_27b103cb-80f5-4d9f-9bc7-a1812dddb90b/manager/0.log"
Apr 17 15:33:20.899305 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:20.899283 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-wl7nz_8ab53a7e-e165-4e21-abe7-b7b332046d0a/postgres/0.log"
Apr 17 15:33:21.033033 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:21.032961 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-79652/perf-node-gather-daemonset-ln2bm"
Apr 17 15:33:21.108984 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:21.108957 2567 scope.go:117] "RemoveContainer" containerID="573291715f13d35799ac8e8857fecdbc8626c8f55be1951ac36ca5f81f64e4f5"
Apr 17 15:33:21.109167 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:33:21.109148 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-5z84j_llm(97964fda-6e6e-42d3-880f-e40d7b19cc02)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-5z84j" podUID="97964fda-6e6e-42d3-880f-e40d7b19cc02"
Apr 17 15:33:22.007091 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:22.007053 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-5p64l_7da6be51-382d-470e-9b6e-dee16a790c0e/openshift-lws-operator/0.log"
Apr 17 15:33:24.109561 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:24.109523 2567 scope.go:117] "RemoveContainer" containerID="c579811e5c8a53efc6abde4330155d597d9e9a1228612667c1fefc7edbba6e63"
Apr 17 15:33:24.110024 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:33:24.109795 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-z7jcr_llm(73ad47c6-e149-436f-b405-396b949ce55e)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-z7jcr" podUID="73ad47c6-e149-436f-b405-396b949ce55e"
Apr 17 15:33:26.162624 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:26.162537 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-fbbct_7ba1a898-bdbb-4176-a7e2-3447d8e9254d/migrator/0.log"
Apr 17 15:33:26.185450 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:26.185430 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-fbbct_7ba1a898-bdbb-4176-a7e2-3447d8e9254d/graceful-termination/0.log"
Apr 17 15:33:26.580259 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:26.580218 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-hchf4_9080ca44-1027-491b-9bf1-12443cd3b452/kube-storage-version-migrator-operator/1.log"
Apr 17 15:33:26.581224 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:26.581207 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-hchf4_9080ca44-1027-491b-9bf1-12443cd3b452/kube-storage-version-migrator-operator/0.log"
Apr 17 15:33:27.831634 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:27.831600 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhh97_ffaaed56-110b-4fd5-9fbe-e8e71f6de33d/kube-multus-additional-cni-plugins/0.log"
Apr 17 15:33:27.853365 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:27.853341 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhh97_ffaaed56-110b-4fd5-9fbe-e8e71f6de33d/egress-router-binary-copy/0.log"
Apr 17 15:33:27.875636 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:27.875615 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhh97_ffaaed56-110b-4fd5-9fbe-e8e71f6de33d/cni-plugins/0.log"
Apr 17 15:33:27.894859 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:27.894837 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhh97_ffaaed56-110b-4fd5-9fbe-e8e71f6de33d/bond-cni-plugin/0.log"
Apr 17 15:33:27.916968 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:27.916948 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhh97_ffaaed56-110b-4fd5-9fbe-e8e71f6de33d/routeoverride-cni/0.log"
Apr 17 15:33:27.936548 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:27.936526 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhh97_ffaaed56-110b-4fd5-9fbe-e8e71f6de33d/whereabouts-cni-bincopy/0.log"
Apr 17 15:33:27.955328 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:27.955293 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhh97_ffaaed56-110b-4fd5-9fbe-e8e71f6de33d/whereabouts-cni/0.log"
Apr 17 15:33:28.032759 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:28.032737 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wn95c_9fb501e8-358b-4ede-bb90-e53237beeef0/kube-multus/0.log"
Apr 17 15:33:28.144977 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:28.144954 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n8fjz_48990b07-a036-41ef-a6cd-89d7520c417c/network-metrics-daemon/0.log"
Apr 17 15:33:28.161912 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:28.161889 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n8fjz_48990b07-a036-41ef-a6cd-89d7520c417c/kube-rbac-proxy/0.log"
Apr 17 15:33:29.109075 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:29.109040 2567 scope.go:117] "RemoveContainer" containerID="5991a206b376bb367500d4776410ee905835abc348236d1cd7b6bb9617101a0b"
Apr 17 15:33:29.109847 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:33:29.109818 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl_llm(2fe3fc1b-c251-430d-8d5a-2011e0193f81)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktptl" podUID="2fe3fc1b-c251-430d-8d5a-2011e0193f81"
Apr 17 15:33:29.503801 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:29.503770 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rphqm_c60e7d84-f238-44b7-91dd-6bebb34d4158/ovn-controller/0.log"
Apr 17 15:33:29.523504 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:29.523475 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rphqm_c60e7d84-f238-44b7-91dd-6bebb34d4158/ovn-acl-logging/0.log"
Apr 17 15:33:29.541474 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:29.541446 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rphqm_c60e7d84-f238-44b7-91dd-6bebb34d4158/kube-rbac-proxy-node/0.log"
Apr 17 15:33:29.560432 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:29.560407 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rphqm_c60e7d84-f238-44b7-91dd-6bebb34d4158/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 15:33:29.577349 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:29.577328 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rphqm_c60e7d84-f238-44b7-91dd-6bebb34d4158/northd/0.log"
Apr 17 15:33:29.595488 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:29.595468 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rphqm_c60e7d84-f238-44b7-91dd-6bebb34d4158/nbdb/0.log"
Apr 17 15:33:29.615025 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:29.615007 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rphqm_c60e7d84-f238-44b7-91dd-6bebb34d4158/sbdb/0.log"
Apr 17 15:33:29.717359 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:29.717334 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rphqm_c60e7d84-f238-44b7-91dd-6bebb34d4158/ovnkube-controller/0.log"
Apr 17 15:33:30.822928 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:30.822898 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6wr6j_d9daeb55-7347-4d29-a0ea-04ac78140a08/network-check-target-container/0.log"
Apr 17 15:33:31.870839 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:31.870811 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-27fdl_30857904-b776-4486-98cc-f89642587b8a/iptables-alerter/0.log"
Apr 17 15:33:32.109652 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:32.109623 2567 scope.go:117] "RemoveContainer" containerID="a0e63a8f728b1bf7f77d17d10cd1e32a81b11efff8ccb6904aec61f74186d781"
Apr 17 15:33:32.109819 ip-10-0-130-92 kubenswrapper[2567]: E0417 15:33:32.109794 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h_llm(883d2618-0022-4c27-a907-543f886010a9)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-2gw6h" podUID="883d2618-0022-4c27-a907-543f886010a9"
Apr 17 15:33:32.652459 ip-10-0-130-92 kubenswrapper[2567]: I0417 15:33:32.652429 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-dxkdc_14d75cd8-6c31-4216-a34f-742e9cc2a898/tuned/0.log"