Feb 17 12:46:16.285436 ip-10-0-131-216 systemd[1]: Starting Kubernetes Kubelet...
Feb 17 12:46:16.742479 ip-10-0-131-216 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 17 12:46:16.742479 ip-10-0-131-216 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Feb 17 12:46:16.742479 ip-10-0-131-216 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 17 12:46:16.742479 ip-10-0-131-216 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Feb 17 12:46:16.742479 ip-10-0-131-216 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 17 12:46:16.744316 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.744227 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 17 12:46:16.749030 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749014 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Feb 17 12:46:16.749030 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749031 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749035 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749040 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749045 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749049 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749052 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749060 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749063 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749066 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749069 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749072 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749075 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749078 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749081 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749084 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749087 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749089 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749093 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749097 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Feb 17 12:46:16.749093 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749100 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749103 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749120 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749123 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749125 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749128 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749131 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749133 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749136 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749138 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749141 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749143 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749146 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749150 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749152 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749155 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749157 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749160 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749163 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749165 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Feb 17 12:46:16.749583 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749168 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749171 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749173 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749176 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749178 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749181 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749183 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749186 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749189 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749191 2573 feature_gate.go:328] unrecognized feature gate: Example
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749194 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749196 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749199 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749201 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749205 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749208 2573 feature_gate.go:328] unrecognized feature gate: Example2
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749210 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749213 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749216 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749218 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Feb 17 12:46:16.750248 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749221 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749224 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749226 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749229 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749231 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749234 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749236 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749239 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749241 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749244 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749247 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749249 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749252 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749255 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749257 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749260 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749263 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749266 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749268 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749271 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Feb 17 12:46:16.750865 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749273 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Feb 17 12:46:16.751428 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749276 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Feb 17 12:46:16.751428 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749279 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Feb 17 12:46:16.751428 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749281 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Feb 17 12:46:16.751428 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749284 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Feb 17 12:46:16.751428 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.749286 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Feb 17 12:46:16.751923 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751906 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Feb 17 12:46:16.751923 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751921 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Feb 17 12:46:16.751923 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751925 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751929 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751932 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751935 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751938 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751941 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751943 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751946 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751949 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751952 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751954 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751957 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751960 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751963 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751966 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751970 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751974 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751976 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751979 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751982 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Feb 17 12:46:16.752047 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751984 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751987 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751989 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751992 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751995 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.751997 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752000 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752003 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752005 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752009 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752011 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752014 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752017 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752019 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752024 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752026 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752029 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752032 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752036 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Feb 17 12:46:16.752619 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752041 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752044 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752047 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752050 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752053 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752056 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752059 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752063 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752066 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752069 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752072 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752074 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752077 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752079 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752082 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752084 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752087 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752089 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752092 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Feb 17 12:46:16.753133 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752095 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752097 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752100 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752103 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752106 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752121 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752126 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752130 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752135 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752138 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752141 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752143 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752146 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752149 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752152 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752154 2573 feature_gate.go:328] unrecognized feature gate: Example2
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752157 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752160 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752168 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752171 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Feb 17 12:46:16.753586 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752173 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752176 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752179 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752182 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752185 2573 feature_gate.go:328] unrecognized feature gate: Example
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752187 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752263 2573 flags.go:64] FLAG: --address="0.0.0.0"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752271 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752279 2573 flags.go:64] FLAG: --anonymous-auth="true"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752284 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752288 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752292 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752298 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752303 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752306 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752309 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752313 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752316 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752319 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752322 2573 flags.go:64] FLAG: --cgroup-root=""
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752326 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752329 2573 flags.go:64] FLAG: --client-ca-file=""
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752332 2573 flags.go:64] FLAG: --cloud-config=""
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752336 2573 flags.go:64] FLAG: --cloud-provider="external"
Feb 17 12:46:16.754075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752339 2573 flags.go:64] FLAG: --cluster-dns="[]"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752343 2573 flags.go:64] FLAG: --cluster-domain=""
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752346 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752349 2573 flags.go:64] FLAG: --config-dir=""
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752352 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752356 2573 flags.go:64] FLAG: --container-log-max-files="5"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752360 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752363 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752367 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752370 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752374 2573 flags.go:64] FLAG: --contention-profiling="false"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752377 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752380 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752383 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752386 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752390 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752394 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752397 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752400 2573 flags.go:64] FLAG: --enable-load-reader="false"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752403 2573 flags.go:64] FLAG: --enable-server="true"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752405 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752410 2573 flags.go:64] FLAG: --event-burst="100"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752414 2573 flags.go:64] FLAG: --event-qps="50"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752417 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752420 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 17 12:46:16.754664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752423 2573 flags.go:64] FLAG: --eviction-hard=""
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752427 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752430 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752435 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752438 2573 flags.go:64] FLAG: --eviction-soft=""
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752442 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752445 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752448 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752451 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752454 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752457 2573 flags.go:64] FLAG: --fail-swap-on="true"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752460 2573 flags.go:64] FLAG: --feature-gates=""
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752464 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752467 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752470 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752475 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752478 2573 flags.go:64] FLAG: --healthz-port="10248"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752481 2573 flags.go:64] FLAG: --help="false"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752484 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-131-216.ec2.internal"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752487 2573 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752490 2573 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752494 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752498 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Feb 17 12:46:16.755294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752501 2573 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752504 2573 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752507 2573 flags.go:64] FLAG: --image-service-endpoint=""
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752510 2573 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752513 2573 flags.go:64] FLAG: --kube-api-burst="100"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752516 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752520 2573 flags.go:64] FLAG: --kube-api-qps="50"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752522 2573 flags.go:64] FLAG: --kube-reserved=""
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752526 2573 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752529 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752532 2573 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752535 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752538 2573 flags.go:64] FLAG: --lock-file=""
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752541 2573 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752544 2573 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752547 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752553 2573 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752556 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752559 2573 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752563 2573 flags.go:64] FLAG: --logging-format="text"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752566 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752569 2573 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752572 2573 flags.go:64] FLAG: --manifest-url=""
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752575 2573 flags.go:64] FLAG: --manifest-url-header=""
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752580 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 17 12:46:16.755853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752583 2573 flags.go:64] FLAG: --max-open-files="1000000"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752588 2573 flags.go:64] FLAG: --max-pods="110"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752591 2573 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752594 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752597 2573 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752600 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752603 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752606 2573 flags.go:64] FLAG: --node-ip="0.0.0.0"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752609 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752622 2573 flags.go:64] FLAG: --node-status-max-images="50"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752625 2573 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752628 2573 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752631 2573 flags.go:64] FLAG: --pod-cidr=""
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752634 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca3deca44439f185f4632d34b1d894f5fa75cccf603cfd634a130c5928811e73"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752640 2573 flags.go:64] FLAG: --pod-manifest-path=""
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752643 2573 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752646 2573 flags.go:64] FLAG: --pods-per-core="0"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752649 2573 flags.go:64] FLAG: --port="10250"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752652 2573 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752655 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01b014533623a7334"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752658 2573 flags.go:64] FLAG: --qos-reserved=""
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752661 2573 flags.go:64] FLAG: --read-only-port="10255"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752664 2573 flags.go:64] FLAG: --register-node="true"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752668 2573 flags.go:64] FLAG: --register-schedulable="true"
Feb 17 12:46:16.756574 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752671 2573 flags.go:64] FLAG: --register-with-taints=""
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752675 2573 flags.go:64] FLAG: --registry-burst="10"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752678 2573 flags.go:64] FLAG: --registry-qps="5"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752680 2573 flags.go:64] FLAG: --reserved-cpus=""
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752683 2573 flags.go:64] FLAG: --reserved-memory=""
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752687 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752690 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752693 2573 flags.go:64] FLAG: --rotate-certificates="false"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752696 2573 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752699 2573 flags.go:64] FLAG: --runonce="false"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752702 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752705 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752708 2573 flags.go:64] FLAG: --seccomp-default="false"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752711 2573 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752714 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752717 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752720 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752724 2573 flags.go:64] FLAG: --storage-driver-password="root"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752727 2573 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752729 2573 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752732 2573 flags.go:64] FLAG: --storage-driver-user="root"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752735 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752738 2573 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752742 2573 flags.go:64] FLAG: --system-cgroups=""
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752745 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Feb 17 12:46:16.757225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752751 2573 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752754 2573 flags.go:64] FLAG: --tls-cert-file=""
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752756 2573 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752761 2573 flags.go:64] FLAG: --tls-min-version=""
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752764 2573 flags.go:64] FLAG: --tls-private-key-file=""
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752767 2573 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752770 2573 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752773 2573 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752776 2573 flags.go:64] FLAG: --v="2"
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752780 2573 flags.go:64] FLAG: --version="false"
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752784 2573 flags.go:64] FLAG: --vmodule=""
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752788 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.752792 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752885 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752889 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752892 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752895 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752900 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752903 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752906 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752909 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752912 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 17 12:46:16.757816 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752915 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752917 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752920 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752923 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752926 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752928 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752931 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752933 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752936 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752939 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752942 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752944 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752947 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752950 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752952 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752955 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752958 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752960 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752963 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 12:46:16.758378 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752966 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752970 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752974 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752977 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752980 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752983 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752985 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752988 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752991 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752995 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.752998 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753001 2573 feature_gate.go:328] unrecognized feature gate: Example2
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753003 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753009 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753011 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753014 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753016 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753019 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753022 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753025 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Feb 17 12:46:16.758881 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753027 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753030 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753032 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753035 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753038 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753040 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753045 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753048 2573 feature_gate.go:328] unrecognized feature gate: Example
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753051 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753054 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753056 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753059 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753062 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753064 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753067 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753069 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753072 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753074 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753077 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753080 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Feb 17 12:46:16.759431 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753082 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753085 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753088 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753091 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753093 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753098 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753101 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753103 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753106 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753128 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753132 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753136 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753139 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753142 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753145 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753148 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753151 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Feb 17 12:46:16.759911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:16.753154 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 17 12:46:16.760375 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.753966 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false
Feb 17 12:46:16.760375 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.760341 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.6"
Feb 17 12:46:16.760375 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.760358 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 17 12:46:16.764647 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.761770 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Feb 17 12:46:16.764647 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.763759 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Feb 17 12:46:16.764784 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.764675 2573 server.go:1019] "Starting client certificate rotation"
Feb 17 12:46:16.764784 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.764773 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Feb 17 12:46:16.764839 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.764817 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Feb 17 12:46:16.791070 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.791047 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 17 12:46:16.792976 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.792951 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 17 12:46:16.805661 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.805632 2573 log.go:25] "Validated CRI v1 runtime API"
Feb 17 12:46:16.813088 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.813067 2573 log.go:25] "Validated CRI v1 image API"
Feb 17 12:46:16.815496 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.815478 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
"Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 17 12:46:16.818055 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.818031 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Feb 17 12:46:16.820820 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.820800 2573 fs.go:135] Filesystem UUIDs: map[754374b2-7c71-4c21-b8cd-7c0407061725:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8f0c0bee-3cf9-41a0-8447-5721385536f1:/dev/nvme0n1p4] Feb 17 12:46:16.820885 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.820818 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Feb 17 12:46:16.826074 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.825960 2573 manager.go:217] Machine: {Timestamp:2026-02-17 12:46:16.824614249 +0000 UTC m=+0.419429065 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3109015 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c08553331558bdc48b7472387eac0 SystemUUID:ec2c0855-3331-558b-dc48-b7472387eac0 BootID:6ba39e72-8089-42e1-b0ae-9ba7dc83b3ae Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6090752 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1c:06:8b:6e:d3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1c:06:8b:6e:d3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fa:5d:f5:ba:b4:02 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 
Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 17 12:46:16.826074 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.826066 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 17 12:46:16.826205 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.826165 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.86.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260204-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 17 12:46:16.828034 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.828001 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 17 12:46:16.828190 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.828036 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-216.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 17 12:46:16.828242 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.828200 2573 topology_manager.go:138] "Creating topology manager with none policy" Feb 17 12:46:16.828242 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.828210 2573 container_manager_linux.go:306] "Creating device plugin manager" Feb 17 12:46:16.828242 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.828218 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 12:46:16.828997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.828986 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 12:46:16.830627 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.830616 2573 state_mem.go:36] "Initialized new in-memory state store" Feb 17 
Feb 17 12:46:16.833461 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.833450 2573 kubelet.go:491] "Attempting to sync node with API server"
Feb 17 12:46:16.833504 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.833466 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 17 12:46:16.833504 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.833480 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 17 12:46:16.833504 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.833491 2573 kubelet.go:397] "Adding apiserver pod source"
Feb 17 12:46:16.833504 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.833505 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 17 12:46:16.834632 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.834619 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Feb 17 12:46:16.834678 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.834639 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Feb 17 12:46:16.836346 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.836327 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n4dns"
Feb 17 12:46:16.837624 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.837610 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-2.rhaos4.20.gitb9ac835.el9" apiVersion="v1"
Feb 17 12:46:16.840233 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.840219 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Feb 17 12:46:16.842816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.842799 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 17 12:46:16.842816 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.842817 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 17 12:46:16.842943 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.842823 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 17 12:46:16.842943 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.842829 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 17 12:46:16.842943 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.842835 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 17 12:46:16.842943 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.842842 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 17 12:46:16.842943 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.842847 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 17 12:46:16.842943 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.842853 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 17 12:46:16.842943 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.842860 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 17 12:46:16.842943 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.842866 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 17 12:46:16.842943 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.842882 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 17 12:46:16.843336 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.843325 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 17 12:46:16.843431 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.843414 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n4dns"
Feb 17 12:46:16.844331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.844315 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 17 12:46:16.844331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.844326 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Feb 17 12:46:16.844951 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:16.844920 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Feb 17 12:46:16.845013 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:16.844930 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-216.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Feb 17 12:46:16.848352 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.848332 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Feb 17 12:46:16.848441 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.848384 2573 server.go:1295] "Started kubelet"
Feb 17 12:46:16.848540 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.848479 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Feb 17 12:46:16.848691 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.848483 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 17 12:46:16.848771 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.848710 2573 server_v1.go:47] "podresources" method="list" useActivePods=true
Feb 17 12:46:16.849483 ip-10-0-131-216 systemd[1]: Started Kubernetes Kubelet.
Feb 17 12:46:16.850414 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.850397 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 17 12:46:16.851778 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.851763 2573 server.go:317] "Adding debug handlers to kubelet server"
Feb 17 12:46:16.853906 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.853887 2573 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-216.ec2.internal" not found
Feb 17 12:46:16.856436 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.856415 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 17 12:46:16.856532 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.856437 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Feb 17 12:46:16.856532 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.856489 2573 volume_manager.go:295] "The desired_state_of_world populator starts"
Feb 17 12:46:16.856532 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.856498 2573 volume_manager.go:297] "Starting Kubelet Volume Manager"
Feb 17 12:46:16.856662 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.856581 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Feb 17 12:46:16.856662 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.856632 2573 reconstruct.go:97] "Volume reconstruction finished"
Feb 17 12:46:16.856662 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.856637 2573 reconciler.go:26] "Reconciler: start to sync state"
Feb 17 12:46:16.857494 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:16.857475 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Feb 17 12:46:16.857867 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.857852 2573 factory.go:153] Registering CRI-O factory
Feb 17 12:46:16.857941 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.857872 2573 factory.go:223] Registration of the crio container factory successfully
Feb 17 12:46:16.857941 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.857928 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 17 12:46:16.857941 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.857936 2573 factory.go:55] Registering systemd factory
Feb 17 12:46:16.858048 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.857944 2573 factory.go:223] Registration of the systemd container factory successfully
Feb 17 12:46:16.858048 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.857966 2573 factory.go:103] Registering Raw factory
Feb 17 12:46:16.858048 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.857979 2573 manager.go:1196] Started watching for new ooms in manager
Feb 17 12:46:16.858861 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:16.858839 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Feb 17 12:46:16.859012 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.858997 2573 manager.go:319] Starting recovery of all containers
Feb 17 12:46:16.859218 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.859197 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Feb 17 12:46:16.861814 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:16.861786 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-216.ec2.internal\" not found" node="ip-10-0-131-216.ec2.internal"
Feb 17 12:46:16.868480 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.868325 2573 manager.go:324] Recovery completed
Feb 17 12:46:16.872771 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.872752 2573 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-216.ec2.internal" not found
Feb 17 12:46:16.874085 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.874071 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Feb 17 12:46:16.875970 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.875957 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientMemory"
Feb 17 12:46:16.876020 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.875984 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasNoDiskPressure"
Feb 17 12:46:16.876020 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.875994 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientPID"
Feb 17 12:46:16.876474 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.876456 2573 cpu_manager.go:222] "Starting CPU manager" policy="none"
Feb 17 12:46:16.876474 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.876465 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Feb 17 12:46:16.876566 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.876481 2573 state_mem.go:36] "Initialized new in-memory state store"
Feb 17 12:46:16.879889 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.879877 2573 policy_none.go:49] "None policy: Start"
Feb 17 12:46:16.879930 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.879893 2573 memory_manager.go:186] "Starting memorymanager" policy="None"
Feb 17 12:46:16.879930 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.879903 2573 state_mem.go:35] "Initializing new in-memory state store"
Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.916896 2573 manager.go:341] "Starting Device Plugin manager"
Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:16.916931 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.916942 2573 server.go:85] "Starting device plugin registration server"
Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.917265 2573 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.917275 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.917394 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.917489 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.917497 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:16.918069 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:16.918120 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.927301 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.928428 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.928458 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.928486 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.928497 2573 kubelet.go:2451] "Starting kubelet main sync loop" Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:16.928536 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.930445 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Feb 17 12:46:16.943748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:16.931989 2573 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-216.ec2.internal" not found Feb 17 12:46:17.018225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.018153 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Feb 17 12:46:17.020189 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.020169 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientMemory" Feb 17 12:46:17.020294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.020201 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasNoDiskPressure" Feb 17 12:46:17.020294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.020212 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientPID" Feb 17 12:46:17.020294 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.020237 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.029121 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.029093 2573 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal"] Feb 17 12:46:17.029178 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.029161 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Feb 17 12:46:17.029262 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.029240 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.029297 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.029275 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-216.ec2.internal\": node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:17.030720 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.030703 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientMemory" Feb 17 12:46:17.030834 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.030730 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasNoDiskPressure" Feb 17 12:46:17.030834 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.030742 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientPID" Feb 17 12:46:17.033026 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.033008 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Feb 17 12:46:17.033141 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.033129 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.033194 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.033152 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Feb 17 12:46:17.033807 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.033785 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientMemory" Feb 17 12:46:17.033893 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.033816 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasNoDiskPressure" Feb 17 12:46:17.033893 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.033832 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientPID" Feb 17 12:46:17.033893 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.033791 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientMemory" Feb 17 12:46:17.033893 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.033884 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasNoDiskPressure" Feb 17 12:46:17.034050 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.033897 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientPID" Feb 17 12:46:17.036071 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.036057 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.036146 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.036083 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Feb 17 12:46:17.038517 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.038501 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientMemory" Feb 17 12:46:17.038593 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.038532 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasNoDiskPressure" Feb 17 12:46:17.038593 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.038543 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientPID" Feb 17 12:46:17.044704 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.044688 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:17.069955 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.069937 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-216.ec2.internal\" not found" node="ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.074530 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.074514 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-216.ec2.internal\" not found" node="ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.145706 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.145678 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:17.157803 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.157778 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e80efaa3ce4d23555689acf7418c8107-config\") pod \"kube-apiserver-proxy-ip-10-0-131-216.ec2.internal\" (UID: \"e80efaa3ce4d23555689acf7418c8107\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.157872 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.157809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b9a1ba4ca914d571759fd26c3b23bc5b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal\" (UID: \"b9a1ba4ca914d571759fd26c3b23bc5b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.157872 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.157828 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9a1ba4ca914d571759fd26c3b23bc5b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal\" (UID: \"b9a1ba4ca914d571759fd26c3b23bc5b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.246297 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.246253 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 
12:46:17.258570 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.258548 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e80efaa3ce4d23555689acf7418c8107-config\") pod \"kube-apiserver-proxy-ip-10-0-131-216.ec2.internal\" (UID: \"e80efaa3ce4d23555689acf7418c8107\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.258671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.258582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b9a1ba4ca914d571759fd26c3b23bc5b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal\" (UID: \"b9a1ba4ca914d571759fd26c3b23bc5b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.258671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.258610 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9a1ba4ca914d571759fd26c3b23bc5b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal\" (UID: \"b9a1ba4ca914d571759fd26c3b23bc5b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.259132 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.259100 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9a1ba4ca914d571759fd26c3b23bc5b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal\" (UID: \"b9a1ba4ca914d571759fd26c3b23bc5b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.259132 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.259119 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e80efaa3ce4d23555689acf7418c8107-config\") pod \"kube-apiserver-proxy-ip-10-0-131-216.ec2.internal\" (UID: \"e80efaa3ce4d23555689acf7418c8107\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.259221 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.259121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b9a1ba4ca914d571759fd26c3b23bc5b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal\" (UID: \"b9a1ba4ca914d571759fd26c3b23bc5b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.346997 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.346924 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:17.372477 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.372451 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.377509 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.377486 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" Feb 17 12:46:17.448094 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.448044 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:17.548472 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.548432 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:17.648974 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.648894 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:17.749374 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.749343 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:17.764802 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.764774 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 17 12:46:17.764951 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.764934 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Feb 17 12:46:17.764995 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.764963 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Feb 17 12:46:17.845431 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.845382 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-02-17 12:41:16 +0000 UTC" deadline="2027-08-09 08:24:58.345596673 +0000 UTC" Feb 17 12:46:17.845431 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.845425 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12907h38m40.500174081s" Feb 17 12:46:17.849463 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.849442 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:17.856583 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.856563 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Feb 17 12:46:17.878869 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.878841 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Feb 17 12:46:17.896813 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.896787 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-cg24n" Feb 17 12:46:17.905224 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.905202 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-cg24n" Feb 17 12:46:17.910902 ip-10-0-131-216 
kubenswrapper[2573]: W0217 12:46:17.910876 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode80efaa3ce4d23555689acf7418c8107.slice/crio-82e9a3a673ed7fdf27afb174bc91c67a887a3a1f419ac9ece3a1c1868c7e9090 WatchSource:0}: Error finding container 82e9a3a673ed7fdf27afb174bc91c67a887a3a1f419ac9ece3a1c1868c7e9090: Status 404 returned error can't find the container with id 82e9a3a673ed7fdf27afb174bc91c67a887a3a1f419ac9ece3a1c1868c7e9090 Feb 17 12:46:17.911480 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:17.911467 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9a1ba4ca914d571759fd26c3b23bc5b.slice/crio-8dfc22bfb48dc6ab48dd78599689a499264fa3940c32cd4d8f0d9be694e4ad4e WatchSource:0}: Error finding container 8dfc22bfb48dc6ab48dd78599689a499264fa3940c32cd4d8f0d9be694e4ad4e: Status 404 returned error can't find the container with id 8dfc22bfb48dc6ab48dd78599689a499264fa3940c32cd4d8f0d9be694e4ad4e Feb 17 12:46:17.914742 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.914725 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 12:46:17.931528 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.931484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" event={"ID":"e80efaa3ce4d23555689acf7418c8107","Type":"ContainerStarted","Data":"82e9a3a673ed7fdf27afb174bc91c67a887a3a1f419ac9ece3a1c1868c7e9090"} Feb 17 12:46:17.932312 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.932291 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" event={"ID":"b9a1ba4ca914d571759fd26c3b23bc5b","Type":"ContainerStarted","Data":"8dfc22bfb48dc6ab48dd78599689a499264fa3940c32cd4d8f0d9be694e4ad4e"} Feb 17 12:46:17.947835 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:17.947813 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Feb 17 12:46:17.949725 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:17.949710 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:18.050179 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:18.050151 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:18.150719 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:18.150642 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:18.251155 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:18.251103 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found" Feb 17 12:46:18.330623 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.330591 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Feb 17 12:46:18.357530 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.357498 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" Feb 17 12:46:18.367491 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.367211 2573 warnings.go:110] "Warning: 
metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Feb 17 12:46:18.372180 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.372156 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" Feb 17 12:46:18.381582 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.381558 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Feb 17 12:46:18.834569 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.834538 2573 apiserver.go:52] "Watching apiserver" Feb 17 12:46:18.843015 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.842990 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Feb 17 12:46:18.844352 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.844328 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-cnhns","openshift-network-diagnostics/network-check-target-kncvl","openshift-ovn-kubernetes/ovnkube-node-494bm","kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal","openshift-cluster-node-tuning-operator/tuned-jb42s","openshift-dns/node-resolver-4jqbk","openshift-multus/multus-additional-cni-plugins-4jlcw","openshift-multus/multus-ttlg5","openshift-network-operator/iptables-alerter-vhlqf","kube-system/konnectivity-agent-pfxld","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r","openshift-image-registry/node-ca-mdbbf","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal"] Feb 17 12:46:18.846905 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.846882 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pfxld" Feb 17 12:46:18.849441 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.849291 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:18.849441 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:18.849415 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:18.849597 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.849553 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9k8j8\"" Feb 17 12:46:18.849650 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.849627 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Feb 17 12:46:18.849700 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.849669 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Feb 17 12:46:18.851585 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.851535 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.853692 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.853673 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.854518 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.854496 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Feb 17 12:46:18.855685 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.855569 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Feb 17 12:46:18.855685 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.855603 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Feb 17 12:46:18.855685 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.855624 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Feb 17 12:46:18.855685 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.855624 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Feb 17 12:46:18.855905 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.855770 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-bpl4q\"" Feb 17 12:46:18.855905 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.855809 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4jqbk" Feb 17 12:46:18.856066 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.856037 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Feb 17 12:46:18.856240 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.856222 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-rs2lg\"" Feb 17 12:46:18.856464 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.856436 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Feb 17 12:46:18.857521 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.857477 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Feb 17 12:46:18.858231 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.858211 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Feb 17 12:46:18.858231 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.858232 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rh78h\"" Feb 17 12:46:18.858410 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.858217 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Feb 17 12:46:18.860541 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.860497 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" Feb 17 12:46:18.860637 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.860601 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ttlg5" Feb 17 12:46:18.863187 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.863167 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vhlqf" Feb 17 12:46:18.864332 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.864311 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Feb 17 12:46:18.864430 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.864331 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Feb 17 12:46:18.864430 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.864315 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qdrgb\"" Feb 17 12:46:18.865648 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.864389 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Feb 17 12:46:18.865648 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.864989 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Feb 17 12:46:18.865648 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.865095 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Feb 17 12:46:18.865648 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.865242 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Feb 17 12:46:18.865648 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.864311 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nqdzj\"" Feb 17 12:46:18.866219 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.866198 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Feb 17 12:46:18.866304 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.866246 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Feb 17 12:46:18.866715 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.866693 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f4d28204-67cd-4aef-b69e-07d8309c6436-tmp\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.866820 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.866736 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-modprobe-d\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.866820 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.866768 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55-hosts-file\") pod \"node-resolver-4jqbk\" (UID: \"13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55\") " pod="openshift-dns/node-resolver-4jqbk" Feb 17 12:46:18.866923 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.866876 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77tdc\" (UniqueName: \"kubernetes.io/projected/13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55-kube-api-access-77tdc\") pod \"node-resolver-4jqbk\" (UID: \"13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55\") " pod="openshift-dns/node-resolver-4jqbk" Feb 17 12:46:18.866923 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.866910 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-run-ovn-kubernetes\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.867025 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.866940 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d39928a0-1a0f-4b0b-b327-943d7c48930d-ovn-node-metrics-cert\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.867079 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867022 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-kubernetes\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.867079 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867051 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-systemd\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.867195 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867092 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-host\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.867195 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867144 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dmrh\" (UniqueName: \"kubernetes.io/projected/bb54e080-0e5a-47e9-bb34-5749143aff6e-kube-api-access-4dmrh\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw" Feb 17 12:46:18.867284 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-log-socket\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.867284 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867244 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d39928a0-1a0f-4b0b-b327-943d7c48930d-ovnkube-config\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.867284 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867273 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d39928a0-1a0f-4b0b-b327-943d7c48930d-env-overrides\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.867417 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867347 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-run\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.867417 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867378 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5nmj\" (UniqueName: \"kubernetes.io/projected/f4d28204-67cd-4aef-b69e-07d8309c6436-kube-api-access-l5nmj\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.867506 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867415 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb54e080-0e5a-47e9-bb34-5749143aff6e-cnibin\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw" Feb 17 12:46:18.867506 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb54e080-0e5a-47e9-bb34-5749143aff6e-os-release\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw" Feb 17 12:46:18.867506 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867462 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Feb 17 12:46:18.867506 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867476 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-node-log\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.867678 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867510 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.867678 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867591 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-sysconfig\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.867678 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867645 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cvxl\" (UniqueName: \"kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl\") pod \"network-check-target-kncvl\" (UID: \"52127944-2f75-482d-bab6-3694ac75b66a\") " pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:18.868132 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.867994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-run-systemd\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.868132 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868070 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2qnhh\"" Feb 17 12:46:18.868132 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868030 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-lib-modules\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.868132 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868133 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bb54e080-0e5a-47e9-bb34-5749143aff6e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw" Feb 17 12:46:18.868562 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868337 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bb54e080-0e5a-47e9-bb34-5749143aff6e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw" Feb 17 12:46:18.868562 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868452 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/93f0a968-7f6f-420d-9fb5-baf856136755-agent-certs\") pod \"konnectivity-agent-pfxld\" (UID: 
\"93f0a968-7f6f-420d-9fb5-baf856136755\") " pod="kube-system/konnectivity-agent-pfxld" Feb 17 12:46:18.868562 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868509 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/93f0a968-7f6f-420d-9fb5-baf856136755-konnectivity-ca\") pod \"konnectivity-agent-pfxld\" (UID: \"93f0a968-7f6f-420d-9fb5-baf856136755\") " pod="kube-system/konnectivity-agent-pfxld" Feb 17 12:46:18.868720 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868574 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-run-netns\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.868720 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868647 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55-tmp-dir\") pod \"node-resolver-4jqbk\" (UID: \"13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55\") " pod="openshift-dns/node-resolver-4jqbk" Feb 17 12:46:18.868720 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868701 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb54e080-0e5a-47e9-bb34-5749143aff6e-system-cni-dir\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw" Feb 17 12:46:18.868901 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868722 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb54e080-0e5a-47e9-bb34-5749143aff6e-cni-binary-copy\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw" Feb 17 12:46:18.868901 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868749 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-cni-bin\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.868901 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868772 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-cni-netd\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.868901 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868803 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d39928a0-1a0f-4b0b-b327-943d7c48930d-ovnkube-script-lib\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.868901 ip-10-0-131-216 
kubenswrapper[2573]: I0217 12:46:18.868869 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7knq\" (UniqueName: \"kubernetes.io/projected/d39928a0-1a0f-4b0b-b327-943d7c48930d-kube-api-access-p7knq\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.869147 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868926 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-sysctl-conf\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.869147 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.868956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-var-lib-kubelet\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.869147 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.869004 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb54e080-0e5a-47e9-bb34-5749143aff6e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw" Feb 17 12:46:18.869147 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.869089 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-var-lib-openvswitch\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.869326 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.869213 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-etc-openvswitch\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.869326 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.869273 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-sysctl-d\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.869740 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.869719 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-tuned\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.869812 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.869763 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-kubelet\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.869812 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.869790 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-systemd-units\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.869812 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.869806 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-slash\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.869955 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.869819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-run-openvswitch\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.869955 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.869835 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-run-ovn\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:18.869955 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.869857 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-sys\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:18.870660 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.870638 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:18.870748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.870724 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r" Feb 17 12:46:18.870826 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:18.870730 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:18.873035 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.873015 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-mdbbf"
Feb 17 12:46:18.873142 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.873058 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Feb 17 12:46:18.873301 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.873195 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Feb 17 12:46:18.873460 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.873443 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Feb 17 12:46:18.873539 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.873462 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-dhnfh\""
Feb 17 12:46:18.875230 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.875211 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Feb 17 12:46:18.875868 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.875595 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-nj8cg\""
Feb 17 12:46:18.875868 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.875630 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Feb 17 12:46:18.875868 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.875749 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Feb 17 12:46:18.905949 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.905880 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-02-17 12:41:17 +0000 UTC" deadline="2027-11-22 09:17:18.589002552 +0000 UTC"
Feb 17 12:46:18.905949 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.905911 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15428h30m59.683095581s"
Feb 17 12:46:18.958228 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.958203 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Feb 17 12:46:18.970484 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970458 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-cni-netd\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.970484 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d39928a0-1a0f-4b0b-b327-943d7c48930d-ovnkube-script-lib\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970508 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-os-release\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970524 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4hsk\" (UniqueName: \"kubernetes.io/projected/a8056817-5e72-49a7-accb-32ae96f50dcb-kube-api-access-t4hsk\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970540 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-var-lib-openvswitch\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-sysctl-d\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-tuned\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970584 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-cni-netd\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970597 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-cnibin\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-var-lib-openvswitch\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970634 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-run-ovn\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970658 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55-hosts-file\") pod \"node-resolver-4jqbk\" (UID: \"13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55\") " pod="openshift-dns/node-resolver-4jqbk"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77tdc\" (UniqueName: \"kubernetes.io/projected/13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55-kube-api-access-77tdc\") pod \"node-resolver-4jqbk\" (UID: \"13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55\") " pod="openshift-dns/node-resolver-4jqbk"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970692 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-run-ovn\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.970712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970712 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-device-dir\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-modprobe-d\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970739 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-sysctl-d\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970753 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55-hosts-file\") pod \"node-resolver-4jqbk\" (UID: \"13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55\") " pod="openshift-dns/node-resolver-4jqbk"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970765 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw6dh\" (UniqueName: \"kubernetes.io/projected/3daec06e-ea34-4fc8-9592-ac5ec216491e-kube-api-access-zw6dh\") pod \"iptables-alerter-vhlqf\" (UID: \"3daec06e-ea34-4fc8-9592-ac5ec216491e\") " pod="openshift-network-operator/iptables-alerter-vhlqf"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970803 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-registration-dir\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970830 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-etc-selinux\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970842 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-modprobe-d\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-var-lib-cni-bin\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970913 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-run-ovn-kubernetes\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970916 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970960 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-run-ovn-kubernetes\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.970972 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-systemd\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971011 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d39928a0-1a0f-4b0b-b327-943d7c48930d-env-overrides\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971043 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb54e080-0e5a-47e9-bb34-5749143aff6e-cnibin\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb54e080-0e5a-47e9-bb34-5749143aff6e-os-release\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971075 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb54e080-0e5a-47e9-bb34-5749143aff6e-cnibin\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.971255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971039 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-systemd\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971095 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-multus-conf-dir\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971134 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3daec06e-ea34-4fc8-9592-ac5ec216491e-host-slash\") pod \"iptables-alerter-vhlqf\" (UID: \"3daec06e-ea34-4fc8-9592-ac5ec216491e\") " pod="openshift-network-operator/iptables-alerter-vhlqf"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971161 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb54e080-0e5a-47e9-bb34-5749143aff6e-os-release\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971164 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-node-log\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971195 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvxl\" (UniqueName: \"kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl\") pod \"network-check-target-kncvl\" (UID: \"52127944-2f75-482d-bab6-3694ac75b66a\") " pod="openshift-network-diagnostics/network-check-target-kncvl"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971201 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-node-log\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971209 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d39928a0-1a0f-4b0b-b327-943d7c48930d-ovnkube-script-lib\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971243 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-run-systemd\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-run-systemd\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bb54e080-0e5a-47e9-bb34-5749143aff6e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/93f0a968-7f6f-420d-9fb5-baf856136755-konnectivity-ca\") pod \"konnectivity-agent-pfxld\" (UID: \"93f0a968-7f6f-420d-9fb5-baf856136755\") " pod="kube-system/konnectivity-agent-pfxld"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971362 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55-tmp-dir\") pod \"node-resolver-4jqbk\" (UID: \"13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55\") " pod="openshift-dns/node-resolver-4jqbk"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971390 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d39928a0-1a0f-4b0b-b327-943d7c48930d-env-overrides\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971399 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb54e080-0e5a-47e9-bb34-5749143aff6e-system-cni-dir\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb54e080-0e5a-47e9-bb34-5749143aff6e-cni-binary-copy\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971448 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-socket-dir\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:18.972002 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971464 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-run-netns\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971478 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb54e080-0e5a-47e9-bb34-5749143aff6e-system-cni-dir\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971491 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971527 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-cni-bin\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971559 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7knq\" (UniqueName: \"kubernetes.io/projected/d39928a0-1a0f-4b0b-b327-943d7c48930d-kube-api-access-p7knq\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-sysctl-conf\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971588 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-cni-bin\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-sysctl-conf\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971761 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-var-lib-kubelet\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971787 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb54e080-0e5a-47e9-bb34-5749143aff6e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971788 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55-tmp-dir\") pod \"node-resolver-4jqbk\" (UID: \"13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55\") " pod="openshift-dns/node-resolver-4jqbk"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971816 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-sys-fs\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-var-lib-cni-multus\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971870 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-etc-openvswitch\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971885 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-var-lib-kubelet\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971895 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3daec06e-ea34-4fc8-9592-ac5ec216491e-iptables-alerter-script\") pod \"iptables-alerter-vhlqf\" (UID: \"3daec06e-ea34-4fc8-9592-ac5ec216491e\") " pod="openshift-network-operator/iptables-alerter-vhlqf"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971906 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bb54e080-0e5a-47e9-bb34-5749143aff6e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.972733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971933 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-multus-cni-dir\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971939 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-etc-openvswitch\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971960 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-multus-socket-dir-parent\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971974 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb54e080-0e5a-47e9-bb34-5749143aff6e-cni-binary-copy\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971986 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/93f0a968-7f6f-420d-9fb5-baf856136755-konnectivity-ca\") pod \"konnectivity-agent-pfxld\" (UID: \"93f0a968-7f6f-420d-9fb5-baf856136755\") " pod="kube-system/konnectivity-agent-pfxld"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.971986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a8056817-5e72-49a7-accb-32ae96f50dcb-multus-daemon-config\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-kubelet\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972052 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-systemd-units\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972075 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-slash\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972089 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-kubelet\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972093 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-systemd-units\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-run-openvswitch\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-slash\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-sys\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-run-openvswitch\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972161 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f4d28204-67cd-4aef-b69e-07d8309c6436-tmp\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ee47699-3923-4434-9f20-86ebd9785b9f-serviceca\") pod \"node-ca-mdbbf\" (UID: \"8ee47699-3923-4434-9f20-86ebd9785b9f\") " pod="openshift-image-registry/node-ca-mdbbf"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-hostroot\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.973502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-sys\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972244 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sslr7\" (UniqueName: \"kubernetes.io/projected/ad710990-167a-49aa-bad8-faa970a4c3bb-kube-api-access-sslr7\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972280 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d39928a0-1a0f-4b0b-b327-943d7c48930d-ovn-node-metrics-cert\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-kubernetes\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-host\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972336 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dmrh\" (UniqueName: \"kubernetes.io/projected/bb54e080-0e5a-47e9-bb34-5749143aff6e-kube-api-access-4dmrh\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972367 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-kubernetes\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972357 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-system-cni-dir\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972419 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-host\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-log-socket\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d39928a0-1a0f-4b0b-b327-943d7c48930d-ovnkube-config\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-run\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972483 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5nmj\" (UniqueName: \"kubernetes.io/projected/f4d28204-67cd-4aef-b69e-07d8309c6436-kube-api-access-l5nmj\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972495 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb54e080-0e5a-47e9-bb34-5749143aff6e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bb54e080-0e5a-47e9-bb34-5749143aff6e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972535 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j82xf\" (UniqueName: \"kubernetes.io/projected/a17ee1be-195d-4b7e-8690-072cd431deef-kube-api-access-j82xf\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:18.974331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972555 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-run\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972619 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-log-socket\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972652 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ee47699-3923-4434-9f20-86ebd9785b9f-host\") pod \"node-ca-mdbbf\" (UID: \"8ee47699-3923-4434-9f20-86ebd9785b9f\") " pod="openshift-image-registry/node-ca-mdbbf"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972699 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972727 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-sysconfig\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-lib-modules\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972778 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-etc-kubernetes\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972789 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972805 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/93f0a968-7f6f-420d-9fb5-baf856136755-agent-certs\") pod \"konnectivity-agent-pfxld\" (UID: \"93f0a968-7f6f-420d-9fb5-baf856136755\") " pod="kube-system/konnectivity-agent-pfxld"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972823 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-sysconfig\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972831 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-run-netns\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972865 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd9m2\" (UniqueName: \"kubernetes.io/projected/8ee47699-3923-4434-9f20-86ebd9785b9f-kube-api-access-pd9m2\") pod \"node-ca-mdbbf\" (UID: \"8ee47699-3923-4434-9f20-86ebd9785b9f\") " pod="openshift-image-registry/node-ca-mdbbf"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972871 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d39928a0-1a0f-4b0b-b327-943d7c48930d-host-run-netns\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972893 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a8056817-5e72-49a7-accb-32ae96f50dcb-cni-binary-copy\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972919 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4d28204-67cd-4aef-b69e-07d8309c6436-lib-modules\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972923 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-run-k8s-cni-cncf-io\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972935 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d39928a0-1a0f-4b0b-b327-943d7c48930d-ovnkube-config\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.975096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972971 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-var-lib-kubelet\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.975789 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.972997 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-run-multus-certs\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:18.975789 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.973066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bb54e080-0e5a-47e9-bb34-5749143aff6e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.975789 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.975352 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f4d28204-67cd-4aef-b69e-07d8309c6436-tmp\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.975789 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.975425 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d39928a0-1a0f-4b0b-b327-943d7c48930d-ovn-node-metrics-cert\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.975789 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.975503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f4d28204-67cd-4aef-b69e-07d8309c6436-etc-tuned\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.975789 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.975778 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/93f0a968-7f6f-420d-9fb5-baf856136755-agent-certs\") pod \"konnectivity-agent-pfxld\" (UID: \"93f0a968-7f6f-420d-9fb5-baf856136755\") " pod="kube-system/konnectivity-agent-pfxld"
Feb 17 12:46:18.976414 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:18.976395 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 12:46:18.976482 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:18.976421 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 12:46:18.976482 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:18.976434 2573 projected.go:194] Error preparing data for projected volume kube-api-access-4cvxl for pod openshift-network-diagnostics/network-check-target-kncvl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 12:46:18.976542 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:18.976509 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl podName:52127944-2f75-482d-bab6-3694ac75b66a nodeName:}" failed. No retries permitted until 2026-02-17 12:46:19.476489489 +0000 UTC m=+3.071304311 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-4cvxl" (UniqueName: "kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl") pod "network-check-target-kncvl" (UID: "52127944-2f75-482d-bab6-3694ac75b66a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 12:46:18.978310 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.978284 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77tdc\" (UniqueName: \"kubernetes.io/projected/13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55-kube-api-access-77tdc\") pod \"node-resolver-4jqbk\" (UID: \"13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55\") " pod="openshift-dns/node-resolver-4jqbk"
Feb 17 12:46:18.982671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.982641 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dmrh\" (UniqueName: \"kubernetes.io/projected/bb54e080-0e5a-47e9-bb34-5749143aff6e-kube-api-access-4dmrh\") pod \"multus-additional-cni-plugins-4jlcw\" (UID: \"bb54e080-0e5a-47e9-bb34-5749143aff6e\") " pod="openshift-multus/multus-additional-cni-plugins-4jlcw"
Feb 17 12:46:18.982959 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.982942 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7knq\" (UniqueName: \"kubernetes.io/projected/d39928a0-1a0f-4b0b-b327-943d7c48930d-kube-api-access-p7knq\") pod \"ovnkube-node-494bm\" (UID: \"d39928a0-1a0f-4b0b-b327-943d7c48930d\") " pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:18.983281 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.983262 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5nmj\" (UniqueName: \"kubernetes.io/projected/f4d28204-67cd-4aef-b69e-07d8309c6436-kube-api-access-l5nmj\") pod \"tuned-jb42s\" (UID: \"f4d28204-67cd-4aef-b69e-07d8309c6436\") " pod="openshift-cluster-node-tuning-operator/tuned-jb42s"
Feb 17 12:46:18.994039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:18.994020 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Feb 17 12:46:19.074165 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw6dh\" (UniqueName: \"kubernetes.io/projected/3daec06e-ea34-4fc8-9592-ac5ec216491e-kube-api-access-zw6dh\") pod \"iptables-alerter-vhlqf\" (UID: \"3daec06e-ea34-4fc8-9592-ac5ec216491e\") " pod="openshift-network-operator/iptables-alerter-vhlqf"
Feb 17 12:46:19.074165 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-registration-dir\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:19.074384 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074195 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-etc-selinux\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:19.074384 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-registration-dir\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:19.074384 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074312 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-etc-selinux\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:19.074384 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074347 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-var-lib-cni-bin\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.074384 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074378 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-multus-conf-dir\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3daec06e-ea34-4fc8-9592-ac5ec216491e-host-slash\") pod \"iptables-alerter-vhlqf\" (UID: \"3daec06e-ea34-4fc8-9592-ac5ec216491e\") " pod="openshift-network-operator/iptables-alerter-vhlqf"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074439 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-multus-conf-dir\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074445 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-var-lib-cni-bin\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-socket-dir\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074472 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3daec06e-ea34-4fc8-9592-ac5ec216491e-host-slash\") pod \"iptables-alerter-vhlqf\" (UID: \"3daec06e-ea34-4fc8-9592-ac5ec216491e\") " pod="openshift-network-operator/iptables-alerter-vhlqf"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074483 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-run-netns\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074507 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-sys-fs\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074540 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-run-netns\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074550 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-var-lib-cni-multus\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-socket-dir\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074585 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3daec06e-ea34-4fc8-9592-ac5ec216491e-iptables-alerter-script\") pod \"iptables-alerter-vhlqf\" (UID: \"3daec06e-ea34-4fc8-9592-ac5ec216491e\") " pod="openshift-network-operator/iptables-alerter-vhlqf"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074601 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-var-lib-cni-multus\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074605 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-sys-fs\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:19.074612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074613 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-multus-cni-dir\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-multus-socket-dir-parent\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a8056817-5e72-49a7-accb-32ae96f50dcb-multus-daemon-config\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-multus-cni-dir\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074675 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ee47699-3923-4434-9f20-86ebd9785b9f-serviceca\") pod \"node-ca-mdbbf\" (UID: \"8ee47699-3923-4434-9f20-86ebd9785b9f\") " pod="openshift-image-registry/node-ca-mdbbf"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:19.074606 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-multus-socket-dir-parent\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074733 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-hostroot\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:19.074748 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs podName:ad710990-167a-49aa-bad8-faa970a4c3bb nodeName:}" failed. No retries permitted until 2026-02-17 12:46:19.574728904 +0000 UTC m=+3.169543734 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs") pod "network-metrics-daemon-cnhns" (UID: "ad710990-167a-49aa-bad8-faa970a4c3bb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074777 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-hostroot\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074784 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sslr7\" (UniqueName: \"kubernetes.io/projected/ad710990-167a-49aa-bad8-faa970a4c3bb-kube-api-access-sslr7\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074843 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-system-cni-dir\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074875 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074901 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j82xf\" (UniqueName: \"kubernetes.io/projected/a17ee1be-195d-4b7e-8690-072cd431deef-kube-api-access-j82xf\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074924 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ee47699-3923-4434-9f20-86ebd9785b9f-host\") pod \"node-ca-mdbbf\" (UID: \"8ee47699-3923-4434-9f20-86ebd9785b9f\") " pod="openshift-image-registry/node-ca-mdbbf"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r"
Feb 17 12:46:19.075347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.074961 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-etc-kubernetes\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5"
Feb 17 12:46:19.075347 ip-10-0-131-216
kubenswrapper[2573]: I0217 12:46:19.075001 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd9m2\" (UniqueName: \"kubernetes.io/projected/8ee47699-3923-4434-9f20-86ebd9785b9f-kube-api-access-pd9m2\") pod \"node-ca-mdbbf\" (UID: \"8ee47699-3923-4434-9f20-86ebd9785b9f\") " pod="openshift-image-registry/node-ca-mdbbf" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075017 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-etc-kubernetes\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a8056817-5e72-49a7-accb-32ae96f50dcb-cni-binary-copy\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-run-k8s-cni-cncf-io\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ee47699-3923-4434-9f20-86ebd9785b9f-host\") pod \"node-ca-mdbbf\" (UID: \"8ee47699-3923-4434-9f20-86ebd9785b9f\") " pod="openshift-image-registry/node-ca-mdbbf" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075089 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-var-lib-kubelet\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-run-multus-certs\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075161 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-os-release\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ee47699-3923-4434-9f20-86ebd9785b9f-serviceca\") pod \"node-ca-mdbbf\" (UID: \"8ee47699-3923-4434-9f20-86ebd9785b9f\") " pod="openshift-image-registry/node-ca-mdbbf" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075185 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t4hsk\" (UniqueName: \"kubernetes.io/projected/a8056817-5e72-49a7-accb-32ae96f50dcb-kube-api-access-t4hsk\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075189 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3daec06e-ea34-4fc8-9592-ac5ec216491e-iptables-alerter-script\") pod \"iptables-alerter-vhlqf\" (UID: \"3daec06e-ea34-4fc8-9592-ac5ec216491e\") " pod="openshift-network-operator/iptables-alerter-vhlqf" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075219 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-var-lib-kubelet\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075220 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-cnibin\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-run-k8s-cni-cncf-io\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-device-dir\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-system-cni-dir\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075278 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a8056817-5e72-49a7-accb-32ae96f50dcb-multus-daemon-config\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-os-release\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.075997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075334 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-host-run-multus-certs\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.076641 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a8056817-5e72-49a7-accb-32ae96f50dcb-cnibin\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.076641 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a17ee1be-195d-4b7e-8690-072cd431deef-device-dir\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r" Feb 17 12:46:19.076641 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.075603 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a8056817-5e72-49a7-accb-32ae96f50dcb-cni-binary-copy\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.084007 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.083980 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd9m2\" (UniqueName: \"kubernetes.io/projected/8ee47699-3923-4434-9f20-86ebd9785b9f-kube-api-access-pd9m2\") pod \"node-ca-mdbbf\" (UID: \"8ee47699-3923-4434-9f20-86ebd9785b9f\") " pod="openshift-image-registry/node-ca-mdbbf" Feb 17 12:46:19.084171 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.084041 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw6dh\" (UniqueName: \"kubernetes.io/projected/3daec06e-ea34-4fc8-9592-ac5ec216491e-kube-api-access-zw6dh\") pod \"iptables-alerter-vhlqf\" (UID: \"3daec06e-ea34-4fc8-9592-ac5ec216491e\") " pod="openshift-network-operator/iptables-alerter-vhlqf" Feb 17 12:46:19.084171 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.084043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j82xf\" (UniqueName: \"kubernetes.io/projected/a17ee1be-195d-4b7e-8690-072cd431deef-kube-api-access-j82xf\") pod \"aws-ebs-csi-driver-node-tzn4r\" (UID: \"a17ee1be-195d-4b7e-8690-072cd431deef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r" Feb 17 12:46:19.084598 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.084545 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4hsk\" (UniqueName: \"kubernetes.io/projected/a8056817-5e72-49a7-accb-32ae96f50dcb-kube-api-access-t4hsk\") pod \"multus-ttlg5\" (UID: \"a8056817-5e72-49a7-accb-32ae96f50dcb\") " pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.090864 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.090842 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sslr7\" (UniqueName: \"kubernetes.io/projected/ad710990-167a-49aa-bad8-faa970a4c3bb-kube-api-access-sslr7\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:19.158384 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.158342 2573 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pfxld" Feb 17 12:46:19.167563 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.167542 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:19.177153 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.177128 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jb42s" Feb 17 12:46:19.181738 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.181716 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4jqbk" Feb 17 12:46:19.187346 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.187323 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" Feb 17 12:46:19.193966 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.193944 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ttlg5" Feb 17 12:46:19.200537 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.200518 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vhlqf" Feb 17 12:46:19.207139 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.207120 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r" Feb 17 12:46:19.211646 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.211628 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mdbbf" Feb 17 12:46:19.288335 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.288308 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Feb 17 12:46:19.478468 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.478399 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvxl\" (UniqueName: \"kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl\") pod \"network-check-target-kncvl\" (UID: \"52127944-2f75-482d-bab6-3694ac75b66a\") " pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:19.478591 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:19.478551 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 12:46:19.478591 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:19.478570 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 12:46:19.478591 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:19.478578 2573 projected.go:194] Error preparing data for projected volume kube-api-access-4cvxl for pod openshift-network-diagnostics/network-check-target-kncvl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 12:46:19.478687 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:19.478628 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl 
podName:52127944-2f75-482d-bab6-3694ac75b66a nodeName:}" failed. No retries permitted until 2026-02-17 12:46:20.478614661 +0000 UTC m=+4.073429464 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4cvxl" (UniqueName: "kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl") pod "network-check-target-kncvl" (UID: "52127944-2f75-482d-bab6-3694ac75b66a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 12:46:19.537327 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:19.537301 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda17ee1be_195d_4b7e_8690_072cd431deef.slice/crio-2173e75e6ba9d650c5d3e573381a5676fca1151a0155a6dd6165f2341bec415d WatchSource:0}: Error finding container 2173e75e6ba9d650c5d3e573381a5676fca1151a0155a6dd6165f2341bec415d: Status 404 returned error can't find the container with id 2173e75e6ba9d650c5d3e573381a5676fca1151a0155a6dd6165f2341bec415d Feb 17 12:46:19.538995 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:19.538969 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd39928a0_1a0f_4b0b_b327_943d7c48930d.slice/crio-9c4c48ae4db7f45d261104b7f0feaa050136926dc2e0ec5ef61dcf0798ffec1b WatchSource:0}: Error finding container 9c4c48ae4db7f45d261104b7f0feaa050136926dc2e0ec5ef61dcf0798ffec1b: Status 404 returned error can't find the container with id 9c4c48ae4db7f45d261104b7f0feaa050136926dc2e0ec5ef61dcf0798ffec1b Feb 17 12:46:19.539873 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:19.539844 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb54e080_0e5a_47e9_bb34_5749143aff6e.slice/crio-fa78b9f9e7d6150d5f5ecac5c4c203b12ab867e28c1d1d907f23246092008462 WatchSource:0}: Error finding container fa78b9f9e7d6150d5f5ecac5c4c203b12ab867e28c1d1d907f23246092008462: Status 404 returned error can't find the container with id fa78b9f9e7d6150d5f5ecac5c4c203b12ab867e28c1d1d907f23246092008462 Feb 17 12:46:19.544988 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:19.544887 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3daec06e_ea34_4fc8_9592_ac5ec216491e.slice/crio-3a94f42efa355dbf600d550018cad9a0c7b091b6389e16a315f1061efef862fa WatchSource:0}: Error finding container 3a94f42efa355dbf600d550018cad9a0c7b091b6389e16a315f1061efef862fa: Status 404 returned error can't find the container with id 3a94f42efa355dbf600d550018cad9a0c7b091b6389e16a315f1061efef862fa Feb 17 12:46:19.546454 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:19.546383 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8056817_5e72_49a7_accb_32ae96f50dcb.slice/crio-f7554b9f9a2b702bae33923b794f263a26d7e4dc77f98bd0f25c25e8cf5b693e WatchSource:0}: Error finding container f7554b9f9a2b702bae33923b794f263a26d7e4dc77f98bd0f25c25e8cf5b693e: Status 404 returned error can't find the container with id f7554b9f9a2b702bae33923b794f263a26d7e4dc77f98bd0f25c25e8cf5b693e Feb 17 12:46:19.547320 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:19.547164 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4d28204_67cd_4aef_b69e_07d8309c6436.slice/crio-ed588f9f8f70d1c06370190345e42040c9075acfa3afc74f6490edb40d709e89 WatchSource:0}: Error finding container ed588f9f8f70d1c06370190345e42040c9075acfa3afc74f6490edb40d709e89: Status 404 returned error can't find the container with id ed588f9f8f70d1c06370190345e42040c9075acfa3afc74f6490edb40d709e89 Feb 17 12:46:19.548216 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:19.548189 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13fc6c26_7ed3_4ea9_9c4f_4317cdd2de55.slice/crio-bc6a0bf7cd39649fba18d00ecd9c45573279863cfda8f16d0acd2116090fdf4d WatchSource:0}: Error finding container bc6a0bf7cd39649fba18d00ecd9c45573279863cfda8f16d0acd2116090fdf4d: Status 404 returned error can't find the container with id bc6a0bf7cd39649fba18d00ecd9c45573279863cfda8f16d0acd2116090fdf4d Feb 17 12:46:19.548884 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:19.548736 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f0a968_7f6f_420d_9fb5_baf856136755.slice/crio-a4a567b49510ab938316fb856d49ec201c2b5406fc305e6b27c6b8a3e79a4bfd WatchSource:0}: Error finding container a4a567b49510ab938316fb856d49ec201c2b5406fc305e6b27c6b8a3e79a4bfd: Status 404 returned error can't find the container with id a4a567b49510ab938316fb856d49ec201c2b5406fc305e6b27c6b8a3e79a4bfd Feb 17 12:46:19.550966 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:46:19.550524 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ee47699_3923_4434_9f20_86ebd9785b9f.slice/crio-13a0bb5c90c9e77a491a687bc746abf278f2b84cf0afd53b5e10df5e4e733929 WatchSource:0}: Error finding container 13a0bb5c90c9e77a491a687bc746abf278f2b84cf0afd53b5e10df5e4e733929: Status 404 returned error can't find the container with id 13a0bb5c90c9e77a491a687bc746abf278f2b84cf0afd53b5e10df5e4e733929 Feb 17 12:46:19.579232 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.579071 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:19.579283 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:19.579207 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 12:46:19.579331 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:19.579322 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs podName:ad710990-167a-49aa-bad8-faa970a4c3bb nodeName:}" failed. No retries permitted until 2026-02-17 12:46:20.579308463 +0000 UTC m=+4.174123267 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs") pod "network-metrics-daemon-cnhns" (UID: "ad710990-167a-49aa-bad8-faa970a4c3bb") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 12:46:19.906387 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.906226 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-02-17 12:41:17 +0000 UTC" deadline="2027-08-27 06:55:33.977969123 +0000 UTC" Feb 17 12:46:19.906387 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.906269 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13338h9m14.071704414s" Feb 17 12:46:19.943493 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.943412 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4jqbk" event={"ID":"13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55","Type":"ContainerStarted","Data":"bc6a0bf7cd39649fba18d00ecd9c45573279863cfda8f16d0acd2116090fdf4d"} Feb 17 12:46:19.945897 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.945825 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jb42s" event={"ID":"f4d28204-67cd-4aef-b69e-07d8309c6436","Type":"ContainerStarted","Data":"ed588f9f8f70d1c06370190345e42040c9075acfa3afc74f6490edb40d709e89"} Feb 17 12:46:19.948219 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.948158 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ttlg5" event={"ID":"a8056817-5e72-49a7-accb-32ae96f50dcb","Type":"ContainerStarted","Data":"f7554b9f9a2b702bae33923b794f263a26d7e4dc77f98bd0f25c25e8cf5b693e"} Feb 17 12:46:19.965935 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.965902 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vhlqf" event={"ID":"3daec06e-ea34-4fc8-9592-ac5ec216491e","Type":"ContainerStarted","Data":"3a94f42efa355dbf600d550018cad9a0c7b091b6389e16a315f1061efef862fa"} Feb 17 12:46:19.969714 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.969657 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" event={"ID":"bb54e080-0e5a-47e9-bb34-5749143aff6e","Type":"ContainerStarted","Data":"fa78b9f9e7d6150d5f5ecac5c4c203b12ab867e28c1d1d907f23246092008462"} Feb 17 12:46:19.983208 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.983177 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" event={"ID":"e80efaa3ce4d23555689acf7418c8107","Type":"ContainerStarted","Data":"4acb708e13c2cdcb639c865bec65064eb2fad851ee0b1cf2b48e37680a129ee2"} Feb 17 12:46:19.996632 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.996467 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" podStartSLOduration=1.996449589 podStartE2EDuration="1.996449589s" podCreationTimestamp="2026-02-17 12:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 12:46:19.996011229 +0000 UTC m=+3.590826057" watchObservedRunningTime="2026-02-17 12:46:19.996449589 +0000 UTC m=+3.591264416" Feb 17 12:46:19.996781 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:19.996643 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mdbbf" event={"ID":"8ee47699-3923-4434-9f20-86ebd9785b9f","Type":"ContainerStarted","Data":"13a0bb5c90c9e77a491a687bc746abf278f2b84cf0afd53b5e10df5e4e733929"} Feb 17 12:46:20.008208 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:20.005654 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pfxld" event={"ID":"93f0a968-7f6f-420d-9fb5-baf856136755","Type":"ContainerStarted","Data":"a4a567b49510ab938316fb856d49ec201c2b5406fc305e6b27c6b8a3e79a4bfd"} Feb 17 12:46:20.011084 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:20.011054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" event={"ID":"d39928a0-1a0f-4b0b-b327-943d7c48930d","Type":"ContainerStarted","Data":"9c4c48ae4db7f45d261104b7f0feaa050136926dc2e0ec5ef61dcf0798ffec1b"} Feb 17 12:46:20.013175 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:20.013149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r" event={"ID":"a17ee1be-195d-4b7e-8690-072cd431deef","Type":"ContainerStarted","Data":"2173e75e6ba9d650c5d3e573381a5676fca1151a0155a6dd6165f2341bec415d"} Feb 17 12:46:20.486041 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:20.486004 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvxl\" (UniqueName: \"kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl\") pod \"network-check-target-kncvl\" (UID: \"52127944-2f75-482d-bab6-3694ac75b66a\") " pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:20.486245 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:20.486171 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 12:46:20.486245 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:20.486194 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 12:46:20.486245 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:20.486208 2573 projected.go:194] Error preparing data for projected volume kube-api-access-4cvxl for pod openshift-network-diagnostics/network-check-target-kncvl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 12:46:20.486398 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:20.486269 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl podName:52127944-2f75-482d-bab6-3694ac75b66a nodeName:}" failed. No retries permitted until 2026-02-17 12:46:22.486248895 +0000 UTC m=+6.081063704 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4cvxl" (UniqueName: "kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl") pod "network-check-target-kncvl" (UID: "52127944-2f75-482d-bab6-3694ac75b66a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 12:46:20.587876 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:20.587286 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:20.587876 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:20.587442 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 12:46:20.587876 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:20.587510 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs podName:ad710990-167a-49aa-bad8-faa970a4c3bb nodeName:}" failed. No retries permitted until 2026-02-17 12:46:22.58748971 +0000 UTC m=+6.182304530 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs") pod "network-metrics-daemon-cnhns" (UID: "ad710990-167a-49aa-bad8-faa970a4c3bb") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 12:46:20.929832 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:20.929278 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:20.929832 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:20.929433 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:20.933062 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:20.930467 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:20.933062 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:20.930582 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:21.030935 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:21.030895 2573 generic.go:358] "Generic (PLEG): container finished" podID="b9a1ba4ca914d571759fd26c3b23bc5b" containerID="7059d86fd2fd4079539ea12f69ae4226153d4064e26bdaa44b0a3cc3eb688b06" exitCode=0 Feb 17 12:46:21.031959 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:21.031931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" event={"ID":"b9a1ba4ca914d571759fd26c3b23bc5b","Type":"ContainerDied","Data":"7059d86fd2fd4079539ea12f69ae4226153d4064e26bdaa44b0a3cc3eb688b06"} Feb 17 12:46:22.050306 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:22.049389 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" event={"ID":"b9a1ba4ca914d571759fd26c3b23bc5b","Type":"ContainerStarted","Data":"9f7b257f92a21e5d5fd337e18af1cd3aa4534493522fd78e3f30aee97f5f9dee"} Feb 17 12:46:22.501921 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:22.501833 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvxl\" (UniqueName: \"kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl\") pod \"network-check-target-kncvl\" (UID: \"52127944-2f75-482d-bab6-3694ac75b66a\") " pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:22.502092 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:22.502016 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 12:46:22.502092 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:22.502036 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 12:46:22.502092 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:22.502048 2573 projected.go:194] Error preparing data for projected volume kube-api-access-4cvxl for pod openshift-network-diagnostics/network-check-target-kncvl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 12:46:22.502288 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:22.502148 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl podName:52127944-2f75-482d-bab6-3694ac75b66a nodeName:}" failed. No retries permitted until 2026-02-17 12:46:26.502085553 +0000 UTC m=+10.096900375 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4cvxl" (UniqueName: "kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl") pod "network-check-target-kncvl" (UID: "52127944-2f75-482d-bab6-3694ac75b66a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 12:46:22.603163 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:22.603066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:22.603350 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:22.603251 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 12:46:22.603350 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:22.603331 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs podName:ad710990-167a-49aa-bad8-faa970a4c3bb nodeName:}" failed. No retries permitted until 2026-02-17 12:46:26.60331018 +0000 UTC m=+10.198124985 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs") pod "network-metrics-daemon-cnhns" (UID: "ad710990-167a-49aa-bad8-faa970a4c3bb") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 12:46:22.929176 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:22.929138 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:22.929364 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:22.929300 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:22.929749 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:22.929728 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:22.929853 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:22.929825 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:24.929140 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:24.929063 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:24.929140 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:24.929092 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:24.929657 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:24.929230 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:24.929657 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:24.929407 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:26.537510 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:26.537395 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvxl\" (UniqueName: \"kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl\") pod \"network-check-target-kncvl\" (UID: \"52127944-2f75-482d-bab6-3694ac75b66a\") " pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:26.538003 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:26.537554 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 12:46:26.538003 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:26.537580 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 12:46:26.538003 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:26.537592 2573 projected.go:194] Error preparing data for projected volume kube-api-access-4cvxl for pod openshift-network-diagnostics/network-check-target-kncvl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 12:46:26.538003 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:26.537654 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl podName:52127944-2f75-482d-bab6-3694ac75b66a nodeName:}" failed. No retries permitted until 2026-02-17 12:46:34.537633453 +0000 UTC m=+18.132448281 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4cvxl" (UniqueName: "kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl") pod "network-check-target-kncvl" (UID: "52127944-2f75-482d-bab6-3694ac75b66a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 12:46:26.638599 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:26.638019 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:26.638599 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:26.638191 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 12:46:26.638599 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:26.638258 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs podName:ad710990-167a-49aa-bad8-faa970a4c3bb nodeName:}" failed. No retries permitted until 2026-02-17 12:46:34.638238157 +0000 UTC m=+18.233052977 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs") pod "network-metrics-daemon-cnhns" (UID: "ad710990-167a-49aa-bad8-faa970a4c3bb") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 12:46:26.932789 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:26.932758 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:26.932958 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:26.932894 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:26.933207 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:26.933181 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:26.933337 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:26.933307 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:28.929702 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:28.929668 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:28.930182 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:28.929799 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:28.930182 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:28.929855 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:28.930182 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:28.929994 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:30.931622 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:30.931592 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:30.932075 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:30.931592 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:30.932075 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:30.931704 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:30.932075 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:30.931776 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:32.928729 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:32.928696 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:32.929199 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:32.928702 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:32.929199 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:32.928834 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:32.929199 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:32.928958 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:34.369904 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.369854 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" podStartSLOduration=16.369839579 podStartE2EDuration="16.369839579s" podCreationTimestamp="2026-02-17 12:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 12:46:22.068369483 +0000 UTC m=+5.663184307" watchObservedRunningTime="2026-02-17 12:46:34.369839579 +0000 UTC m=+17.964654400" Feb 17 12:46:34.370500 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.370003 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6d7mm"] Feb 17 12:46:34.397248 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.397208 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:34.397430 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:34.397323 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d" Feb 17 12:46:34.497526 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.497480 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ba4af195-0270-4e87-a0fe-8e7fdd18175d-kubelet-config\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:34.497712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.497545 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:34.497712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.497607 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ba4af195-0270-4e87-a0fe-8e7fdd18175d-dbus\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:34.598997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.598958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:34.598997 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.599011 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvxl\" (UniqueName: \"kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl\") pod \"network-check-target-kncvl\" (UID: \"52127944-2f75-482d-bab6-3694ac75b66a\") " pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:34.599320 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.599051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ba4af195-0270-4e87-a0fe-8e7fdd18175d-dbus\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:34.599320 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.599090 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ba4af195-0270-4e87-a0fe-8e7fdd18175d-kubelet-config\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:34.599320 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:34.599146 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Feb 17 12:46:34.599320 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:34.599224 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret podName:ba4af195-0270-4e87-a0fe-8e7fdd18175d 
nodeName:}" failed. No retries permitted until 2026-02-17 12:46:35.099206377 +0000 UTC m=+18.694021194 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret") pod "global-pull-secret-syncer-6d7mm" (UID: "ba4af195-0270-4e87-a0fe-8e7fdd18175d") : object "kube-system"/"original-pull-secret" not registered Feb 17 12:46:34.599320 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:34.599226 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 12:46:34.599320 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:34.599249 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 12:46:34.599320 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:34.599263 2573 projected.go:194] Error preparing data for projected volume kube-api-access-4cvxl for pod openshift-network-diagnostics/network-check-target-kncvl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 12:46:34.599320 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.599287 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ba4af195-0270-4e87-a0fe-8e7fdd18175d-dbus\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:34.599320 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.599227 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ba4af195-0270-4e87-a0fe-8e7fdd18175d-kubelet-config\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:34.599320 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:34.599314 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl podName:52127944-2f75-482d-bab6-3694ac75b66a nodeName:}" failed. No retries permitted until 2026-02-17 12:46:50.599298759 +0000 UTC m=+34.194113578 (durationBeforeRetry 16s). 
Feb 17 12:46:34.700279 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.700244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:34.700540 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:34.700390 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 12:46:34.700540 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:34.700465 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs podName:ad710990-167a-49aa-bad8-faa970a4c3bb nodeName:}" failed. No retries permitted until 2026-02-17 12:46:50.70044842 +0000 UTC m=+34.295263223 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs") pod "network-metrics-daemon-cnhns" (UID: "ad710990-167a-49aa-bad8-faa970a4c3bb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 12:46:34.929664 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.929634 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl"
Feb 17 12:46:34.929951 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:34.929766 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a"
Feb 17 12:46:34.929951 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:34.929819 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:34.930063 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:34.929951 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb"
Feb 17 12:46:35.103585 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:35.103488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm"
Feb 17 12:46:35.103755 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:35.103662 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Feb 17 12:46:35.103755 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:35.103737 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret podName:ba4af195-0270-4e87-a0fe-8e7fdd18175d nodeName:}" failed. No retries permitted until 2026-02-17 12:46:36.103718871 +0000 UTC m=+19.698533689 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret") pod "global-pull-secret-syncer-6d7mm" (UID: "ba4af195-0270-4e87-a0fe-8e7fdd18175d") : object "kube-system"/"original-pull-secret" not registered
Feb 17 12:46:35.929306 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:35.929272 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm"
Feb 17 12:46:35.929783 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:35.929404 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d"
Feb 17 12:46:36.112092 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:36.112052 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm"
Feb 17 12:46:36.112276 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:36.112222 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Feb 17 12:46:36.112341 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:36.112297 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret podName:ba4af195-0270-4e87-a0fe-8e7fdd18175d nodeName:}" failed. No retries permitted until 2026-02-17 12:46:38.112276455 +0000 UTC m=+21.707091263 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret") pod "global-pull-secret-syncer-6d7mm" (UID: "ba4af195-0270-4e87-a0fe-8e7fdd18175d") : object "kube-system"/"original-pull-secret" not registered
Feb 17 12:46:36.930468 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:36.930437 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:36.930900 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:36.930556 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb"
Feb 17 12:46:36.930900 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:36.930621 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl"
Feb 17 12:46:36.930900 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:36.930685 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a"
Feb 17 12:46:37.108908 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:37.108677 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4jqbk" event={"ID":"13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55","Type":"ContainerStarted","Data":"96a5c83cf080ff5a0fa71c628b13ae4db3daf72f69095d0acd3f61811d2b438d"}
Feb 17 12:46:37.110414 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:37.110376 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jb42s" event={"ID":"f4d28204-67cd-4aef-b69e-07d8309c6436","Type":"ContainerStarted","Data":"eaee25c48d295bc9770155d694505e79265ee5e6adce2220d768590e05163874"}
Feb 17 12:46:37.124005 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:37.123884 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4jqbk" podStartSLOduration=2.99799407 podStartE2EDuration="20.123867503s" podCreationTimestamp="2026-02-17 12:46:17 +0000 UTC" firstStartedPulling="2026-02-17 12:46:19.550334178 +0000 UTC m=+3.145148993" lastFinishedPulling="2026-02-17 12:46:36.676207612 +0000 UTC m=+20.271022426" observedRunningTime="2026-02-17 12:46:37.123074036 +0000 UTC m=+20.717888861" watchObservedRunningTime="2026-02-17 12:46:37.123867503 +0000 UTC m=+20.718682329"
Feb 17 12:46:37.138694 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:37.138645 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jb42s" podStartSLOduration=3.01174407 podStartE2EDuration="20.138628739s" podCreationTimestamp="2026-02-17 12:46:17 +0000 UTC" firstStartedPulling="2026-02-17 12:46:19.549218522 +0000 UTC m=+3.144033326" lastFinishedPulling="2026-02-17 12:46:36.676103189 +0000 UTC m=+20.270917995" observedRunningTime="2026-02-17 12:46:37.138264166 +0000 UTC m=+20.733079024" watchObservedRunningTime="2026-02-17 12:46:37.138628739 +0000 UTC m=+20.733443565"
Feb 17 12:46:37.929083 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:37.928892 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm"
Feb 17 12:46:37.929286 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:37.929189 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d"
Feb 17 12:46:38.117848 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.117816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pfxld" event={"ID":"93f0a968-7f6f-420d-9fb5-baf856136755","Type":"ContainerStarted","Data":"f8c5f870dd4115f4e0577cf94d55b906b2bada02169ea06bb27478911386694e"}
Feb 17 12:46:38.120133 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.120099 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log"
Feb 17 12:46:38.120394 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.120376 2573 generic.go:358] "Generic (PLEG): container finished" podID="d39928a0-1a0f-4b0b-b327-943d7c48930d" containerID="697bb78cdc3a56e1b2bab569efc24374caeec4d1840291b2d33dfe69154f0f97" exitCode=1
Feb 17 12:46:38.120449 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.120435 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" event={"ID":"d39928a0-1a0f-4b0b-b327-943d7c48930d","Type":"ContainerStarted","Data":"87eaeaa2f0003118bd96c34a536616eb28953431a242785481888b7ad4f13372"}
Feb 17 12:46:38.120493 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.120456 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" event={"ID":"d39928a0-1a0f-4b0b-b327-943d7c48930d","Type":"ContainerStarted","Data":"c020b636f10ad09329c8bde81836867aff03969fb4becdb52accbfb00c5848a6"}
Feb 17 12:46:38.120493 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.120467 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" event={"ID":"d39928a0-1a0f-4b0b-b327-943d7c48930d","Type":"ContainerStarted","Data":"ac0ae4238c188ec5938052e76df660294f9ddc2d838264ae7d576f07289c949c"}
Feb 17 12:46:38.120493 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.120475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" event={"ID":"d39928a0-1a0f-4b0b-b327-943d7c48930d","Type":"ContainerStarted","Data":"de7ae3b4f8ce86b1b6201043aba15d4b91049f287ed07e7a9ada8ed5acde7125"}
Feb 17 12:46:38.120493 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.120486 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" event={"ID":"d39928a0-1a0f-4b0b-b327-943d7c48930d","Type":"ContainerStarted","Data":"9e70fe8c9c620c2ba403bd7d471cf38c77f823c89f4e196ea795ca2d635acd58"}
Feb 17 12:46:38.120627 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.120499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" event={"ID":"d39928a0-1a0f-4b0b-b327-943d7c48930d","Type":"ContainerDied","Data":"697bb78cdc3a56e1b2bab569efc24374caeec4d1840291b2d33dfe69154f0f97"}
Feb 17 12:46:38.121557 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.121534 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r" event={"ID":"a17ee1be-195d-4b7e-8690-072cd431deef","Type":"ContainerStarted","Data":"234921d22dbd14780348951c501b362d6be6a137f1968b84d71034c408ba0438"}
Feb 17 12:46:38.122731 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.122706 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ttlg5" event={"ID":"a8056817-5e72-49a7-accb-32ae96f50dcb","Type":"ContainerStarted","Data":"f6af64af5c2722db12d41f2b5a3387065d08928dac2bfa3fcd6e0e4e0d00ef6c"}
Feb 17 12:46:38.124019 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.123996 2573 generic.go:358] "Generic (PLEG): container finished" podID="bb54e080-0e5a-47e9-bb34-5749143aff6e" containerID="e61bf8c8d5150881e3c056b8d868755009e28ccd5638a3a4589aeba47aa6f9cd" exitCode=0
Feb 17 12:46:38.124139 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.124079 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" event={"ID":"bb54e080-0e5a-47e9-bb34-5749143aff6e","Type":"ContainerDied","Data":"e61bf8c8d5150881e3c056b8d868755009e28ccd5638a3a4589aeba47aa6f9cd"}
Feb 17 12:46:38.125441 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.125388 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mdbbf" event={"ID":"8ee47699-3923-4434-9f20-86ebd9785b9f","Type":"ContainerStarted","Data":"d3bd4b4297915b1dbb479cf10fe169862467dee63161ee34ea59caa71d0faa02"}
Feb 17 12:46:38.128315 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.128297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm"
Feb 17 12:46:38.128413 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:38.128402 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Feb 17 12:46:38.128461 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:38.128445 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret podName:ba4af195-0270-4e87-a0fe-8e7fdd18175d nodeName:}" failed. No retries permitted until 2026-02-17 12:46:42.128433631 +0000 UTC m=+25.723248434 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret") pod "global-pull-secret-syncer-6d7mm" (UID: "ba4af195-0270-4e87-a0fe-8e7fdd18175d") : object "kube-system"/"original-pull-secret" not registered
Feb 17 12:46:38.144946 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.144911 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mdbbf" podStartSLOduration=4.038137706 podStartE2EDuration="21.144899118s" podCreationTimestamp="2026-02-17 12:46:17 +0000 UTC" firstStartedPulling="2026-02-17 12:46:19.552991456 +0000 UTC m=+3.147806271" lastFinishedPulling="2026-02-17 12:46:36.659752876 +0000 UTC m=+20.254567683" observedRunningTime="2026-02-17 12:46:38.144819509 +0000 UTC m=+21.739634334" watchObservedRunningTime="2026-02-17 12:46:38.144899118 +0000 UTC m=+21.739713942"
Feb 17 12:46:38.145456 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.145431 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pfxld" podStartSLOduration=3.892676355 podStartE2EDuration="21.14542545s" podCreationTimestamp="2026-02-17 12:46:17 +0000 UTC" firstStartedPulling="2026-02-17 12:46:19.550716654 +0000 UTC m=+3.145531464" lastFinishedPulling="2026-02-17 12:46:36.803465756 +0000 UTC m=+20.398280559" observedRunningTime="2026-02-17 12:46:38.131721789 +0000 UTC m=+21.726536608" watchObservedRunningTime="2026-02-17 12:46:38.14542545 +0000 UTC m=+21.740240274"
Feb 17 12:46:38.181082 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.180999 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ttlg5" podStartSLOduration=3.903756434 podStartE2EDuration="21.18098354s" podCreationTimestamp="2026-02-17 12:46:17 +0000 UTC" firstStartedPulling="2026-02-17 12:46:19.547705866 +0000 UTC m=+3.142520686" lastFinishedPulling="2026-02-17 12:46:36.824932984 +0000 UTC m=+20.419747792" observedRunningTime="2026-02-17 12:46:38.180416277 +0000 UTC m=+21.775231104" watchObservedRunningTime="2026-02-17 12:46:38.18098354 +0000 UTC m=+21.775798365"
Feb 17 12:46:38.290393 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.290365 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Feb 17 12:46:38.927378 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.927248 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-02-17T12:46:38.290387123Z","UUID":"5cebc51f-52c4-482e-a046-22a49b72d3bb","Handler":null,"Name":"","Endpoint":""}
Feb 17 12:46:38.928817 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.928795 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:38.929011 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:38.928978 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb"
pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:38.929200 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.929036 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:38.929200 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.929090 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Feb 17 12:46:38.929200 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:38.929129 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Feb 17 12:46:38.929200 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:38.929163 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:39.129566 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:39.129511 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vhlqf" event={"ID":"3daec06e-ea34-4fc8-9592-ac5ec216491e","Type":"ContainerStarted","Data":"95047274040a3a820b7716a7f831adc1234da29c7c0495c299d862acbac3b50b"} Feb 17 12:46:39.131749 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:39.131721 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r" event={"ID":"a17ee1be-195d-4b7e-8690-072cd431deef","Type":"ContainerStarted","Data":"3112ef3e81859f7161629f4cb6725507c6444e6813fcd5e7deb1ee101464afb2"} Feb 17 12:46:39.929779 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:39.929682 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:39.929929 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:39.929821 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d" Feb 17 12:46:40.139036 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:40.137918 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log" Feb 17 12:46:40.139488 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:40.139058 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" event={"ID":"d39928a0-1a0f-4b0b-b327-943d7c48930d","Type":"ContainerStarted","Data":"af707f4c1be1efe51144b9d95645d1f0d6c9bf24c91fe8e9517553c9e7d3179a"} Feb 17 12:46:40.141552 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:40.141519 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r" event={"ID":"a17ee1be-195d-4b7e-8690-072cd431deef","Type":"ContainerStarted","Data":"0c5ea069b20d896a0c039a6584748f762bacd6e99ee036cf88ba49dd44e8cabe"} Feb 17 12:46:40.161579 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:40.161526 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tzn4r" podStartSLOduration=3.43160678 podStartE2EDuration="23.161510643s" podCreationTimestamp="2026-02-17 12:46:17 +0000 UTC" firstStartedPulling="2026-02-17 12:46:19.539217468 +0000 UTC m=+3.134032271" lastFinishedPulling="2026-02-17 12:46:39.26912132 +0000 UTC m=+22.863936134" observedRunningTime="2026-02-17 12:46:40.161350602 +0000 UTC m=+23.756165447" watchObservedRunningTime="2026-02-17 12:46:40.161510643 +0000 UTC m=+23.756325472" Feb 17 12:46:40.162146 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:40.162100 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vhlqf" podStartSLOduration=5.9054541480000005 podStartE2EDuration="23.16209454s" podCreationTimestamp="2026-02-17 12:46:17 +0000 UTC" firstStartedPulling="2026-02-17 12:46:19.546870563 +0000 UTC m=+3.141685375" lastFinishedPulling="2026-02-17 12:46:36.803510952 +0000 UTC m=+20.398325767" observedRunningTime="2026-02-17 12:46:39.143817204 +0000 UTC m=+22.738632030" watchObservedRunningTime="2026-02-17 12:46:40.16209454 +0000 UTC m=+23.756909365" Feb 17 12:46:40.929432 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:40.929229 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:40.929635 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:40.929229 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:40.929635 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:40.929536 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:40.929635 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:40.929594 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:41.928732 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:41.928702 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:41.929254 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:41.928835 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d" Feb 17 12:46:42.149841 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:42.149669 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log" Feb 17 12:46:42.150152 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:42.150129 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" event={"ID":"d39928a0-1a0f-4b0b-b327-943d7c48930d","Type":"ContainerStarted","Data":"782b464b3ae24d9f09c4faad2ba40bb6de64cf4b938e25670d598fe6d6937618"} Feb 17 12:46:42.150627 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:42.150566 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:42.150627 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:42.150595 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:42.150627 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:42.150608 2573 scope.go:117] "RemoveContainer" containerID="697bb78cdc3a56e1b2bab569efc24374caeec4d1840291b2d33dfe69154f0f97" Feb 17 12:46:42.153460 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:42.152807 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" event={"ID":"bb54e080-0e5a-47e9-bb34-5749143aff6e","Type":"ContainerStarted","Data":"99169cae86770da079c9c6ce7092c85c3733ae0689609b904436813dc6f979fe"} Feb 17 12:46:42.158888 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:42.158592 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:42.158888 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:42.158775 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Feb 17 12:46:42.158888 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:42.158839 2573 
Feb 17 12:46:42.167660 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:42.167630 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:42.167757 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:42.167730 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-494bm"
Feb 17 12:46:42.929301 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:42.929267 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:42.929913 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:42.929262 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl"
Feb 17 12:46:42.929913 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:42.929389 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb"
Feb 17 12:46:42.929913 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:42.929430 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a"
Feb 17 12:46:43.056186 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.056149 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pfxld"
Feb 17 12:46:43.056817 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.056800 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pfxld"
Feb 17 12:46:43.159150 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.159128 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log"
Feb 17 12:46:43.159466 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.159447 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" event={"ID":"d39928a0-1a0f-4b0b-b327-943d7c48930d","Type":"ContainerStarted","Data":"52abed02ee3159e9c1bb5dbac718d70b9c0df94835b442bb64d969c20d14c0fb"}
Feb 17 12:46:43.159554 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.159539 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 12:46:43.160845 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.160823 2573 generic.go:358] "Generic (PLEG): container finished" podID="bb54e080-0e5a-47e9-bb34-5749143aff6e" containerID="99169cae86770da079c9c6ce7092c85c3733ae0689609b904436813dc6f979fe" exitCode=0
Feb 17 12:46:43.160937 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.160866 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" event={"ID":"bb54e080-0e5a-47e9-bb34-5749143aff6e","Type":"ContainerDied","Data":"99169cae86770da079c9c6ce7092c85c3733ae0689609b904436813dc6f979fe"}
Feb 17 12:46:43.161085 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.161072 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pfxld"
Feb 17 12:46:43.161556 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.161537 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pfxld"
Feb 17 12:46:43.185909 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.185803 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" podStartSLOduration=8.857003057 podStartE2EDuration="26.185784156s" podCreationTimestamp="2026-02-17 12:46:17 +0000 UTC" firstStartedPulling="2026-02-17 12:46:19.542272259 +0000 UTC m=+3.137087078" lastFinishedPulling="2026-02-17 12:46:36.871053363 +0000 UTC m=+20.465868177" observedRunningTime="2026-02-17 12:46:43.184198478 +0000 UTC m=+26.779013303" watchObservedRunningTime="2026-02-17 12:46:43.185784156 +0000 UTC m=+26.780599073"
Feb 17 12:46:43.928954 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.928929 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm"
Feb 17 12:46:43.929162 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:43.929038 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d"
Feb 17 12:46:43.945919 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.945885 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6d7mm"]
Feb 17 12:46:43.946532 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.946512 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cnhns"]
Feb 17 12:46:43.946610 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.946600 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:43.946721 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:43.946705 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb"
Feb 17 12:46:43.948280 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.948255 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kncvl"]
Feb 17 12:46:43.948387 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:43.948328 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl"
Feb 17 12:46:43.948421 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:43.948394 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a"
Feb 17 12:46:44.164318 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:44.164285 2573 generic.go:358] "Generic (PLEG): container finished" podID="bb54e080-0e5a-47e9-bb34-5749143aff6e" containerID="d7e1fa0efb93ee974dfbea347db152378a1d20876497ec600387685da55dffd0" exitCode=0
Feb 17 12:46:44.164482 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:44.164392 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm"
Feb 17 12:46:44.164482 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:44.164398 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" event={"ID":"bb54e080-0e5a-47e9-bb34-5749143aff6e","Type":"ContainerDied","Data":"d7e1fa0efb93ee974dfbea347db152378a1d20876497ec600387685da55dffd0"}
Feb 17 12:46:44.164596 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:44.164526 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d"
pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d" Feb 17 12:46:44.164709 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:44.164695 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 12:46:45.168442 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:45.168412 2573 generic.go:358] "Generic (PLEG): container finished" podID="bb54e080-0e5a-47e9-bb34-5749143aff6e" containerID="55ebb23d51a84390a35d0dc07fb32378b9a017a906bb7224375ec8b3ca417400" exitCode=0 Feb 17 12:46:45.168814 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:45.168486 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" event={"ID":"bb54e080-0e5a-47e9-bb34-5749143aff6e","Type":"ContainerDied","Data":"55ebb23d51a84390a35d0dc07fb32378b9a017a906bb7224375ec8b3ca417400"} Feb 17 12:46:45.929147 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:45.928907 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:45.929306 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:45.928907 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:45.929306 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:45.929276 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d" Feb 17 12:46:45.929415 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:45.929368 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:45.929415 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:45.928918 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:45.929513 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:45.929445 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:46.420970 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:46.420937 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:46:47.929639 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:47.929598 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:47.930148 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:47.929616 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:47.930148 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:47.929851 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:47.930148 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:47.929855 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d" Feb 17 12:46:47.930148 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:47.929976 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:47.930148 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:47.930041 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:49.929125 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:49.929078 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:49.929125 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:49.929096 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:49.929624 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:49.929143 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:49.929624 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:49.929241 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:49.929624 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:49.929337 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d" Feb 17 12:46:49.929624 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:49.929415 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:46:50.217804 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:50.217728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:46:50.218354 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:50.218326 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Feb 17 12:46:50.218487 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:50.218398 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret podName:ba4af195-0270-4e87-a0fe-8e7fdd18175d nodeName:}" failed. No retries permitted until 2026-02-17 12:47:06.218383621 +0000 UTC m=+49.813198424 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret") pod "global-pull-secret-syncer-6d7mm" (UID: "ba4af195-0270-4e87-a0fe-8e7fdd18175d") : object "kube-system"/"original-pull-secret" not registered Feb 17 12:46:50.621849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:50.621762 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvxl\" (UniqueName: \"kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl\") pod \"network-check-target-kncvl\" (UID: \"52127944-2f75-482d-bab6-3694ac75b66a\") " pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:46:50.621999 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:50.621958 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 12:46:50.621999 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:50.621981 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 12:46:50.621999 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:50.621992 2573 projected.go:194] Error preparing data for projected volume kube-api-access-4cvxl for pod openshift-network-diagnostics/network-check-target-kncvl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 12:46:50.622091 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:50.622056 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl podName:52127944-2f75-482d-bab6-3694ac75b66a nodeName:}" 
Feb 17 12:46:50.722292 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:50.722245 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:50.722471 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:50.722391 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 12:46:50.722471 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:50.722465 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs podName:ad710990-167a-49aa-bad8-faa970a4c3bb nodeName:}" failed. No retries permitted until 2026-02-17 12:47:22.722448523 +0000 UTC m=+66.317263326 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs") pod "network-metrics-daemon-cnhns" (UID: "ad710990-167a-49aa-bad8-faa970a4c3bb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 12:46:51.181715 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:51.181687 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" event={"ID":"bb54e080-0e5a-47e9-bb34-5749143aff6e","Type":"ContainerStarted","Data":"6aa73f63316175679fc2ff963f4ad8fe0a34d52343749c92f48444d99db35264"}
Feb 17 12:46:51.929060 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:51.929024 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm"
Feb 17 12:46:51.929279 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:51.929169 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d"
Feb 17 12:46:51.929391 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:51.929368 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:51.929502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:51.929420 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl"
Feb 17 12:46:51.929502 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:51.929493 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a"
Feb 17 12:46:51.929587 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:51.929559 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb"
Feb 17 12:46:52.185904 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:52.185824 2573 generic.go:358] "Generic (PLEG): container finished" podID="bb54e080-0e5a-47e9-bb34-5749143aff6e" containerID="6aa73f63316175679fc2ff963f4ad8fe0a34d52343749c92f48444d99db35264" exitCode=0
Feb 17 12:46:52.185904 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:52.185868 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" event={"ID":"bb54e080-0e5a-47e9-bb34-5749143aff6e","Type":"ContainerDied","Data":"6aa73f63316175679fc2ff963f4ad8fe0a34d52343749c92f48444d99db35264"}
Feb 17 12:46:53.190475 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:53.190443 2573 generic.go:358] "Generic (PLEG): container finished" podID="bb54e080-0e5a-47e9-bb34-5749143aff6e" containerID="959f4e866e5fe99980c235c06b10fc32dd3a6040d2569331c492e478889e0e7c" exitCode=0
Feb 17 12:46:53.190861 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:53.190483 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" event={"ID":"bb54e080-0e5a-47e9-bb34-5749143aff6e","Type":"ContainerDied","Data":"959f4e866e5fe99980c235c06b10fc32dd3a6040d2569331c492e478889e0e7c"}
Feb 17 12:46:53.929393 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:53.929363 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl"
Feb 17 12:46:53.929550 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:53.929404 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:53.929550 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:53.929448 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm"
Feb 17 12:46:53.929550 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:53.929527 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d"
Feb 17 12:46:53.929683 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:53.929658 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb"
Feb 17 12:46:53.929769 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:53.929748 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a"
Feb 17 12:46:54.194437 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:54.194357 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" event={"ID":"bb54e080-0e5a-47e9-bb34-5749143aff6e","Type":"ContainerStarted","Data":"8c7cb2baf3b2da5a77b2a7db592f32d8546ded5f7efaf753ed8f006e3dd01c59"}
Feb 17 12:46:54.218386 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:54.218338 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4jlcw" podStartSLOduration=5.756112386 podStartE2EDuration="37.218322251s" podCreationTimestamp="2026-02-17 12:46:17 +0000 UTC" firstStartedPulling="2026-02-17 12:46:19.544647695 +0000 UTC m=+3.139462506" lastFinishedPulling="2026-02-17 12:46:51.006857568 +0000 UTC m=+34.601672371" observedRunningTime="2026-02-17 12:46:54.218040545 +0000 UTC m=+37.812855371" watchObservedRunningTime="2026-02-17 12:46:54.218322251 +0000 UTC m=+37.813137075"
Feb 17 12:46:55.928795 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:55.928753 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm"
Feb 17 12:46:55.928795 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:55.928803 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:55.929233 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:55.928878 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d"
Feb 17 12:46:55.929233 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:55.928951 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl"
Feb 17 12:46:55.929233 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:55.928933 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb"
Feb 17 12:46:55.929233 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:55.929024 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a"
Feb 17 12:46:57.928769 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:57.928732 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns"
Feb 17 12:46:57.929187 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:57.928732 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm"
Feb 17 12:46:57.929187 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:57.928853 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb"
Feb 17 12:46:57.929187 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:57.928909 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d"
Feb 17 12:46:57.929187 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:57.928745 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl"
Feb 17 12:46:57.929187 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:57.928973 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a"
Feb 17 12:46:59.929553 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:59.929518 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm"
Feb 17 12:46:59.929925 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:59.929519 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl"
Feb 17 12:46:59.929925 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:59.929624 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d" Feb 17 12:46:59.929925 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:46:59.929523 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:46:59.929925 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:59.929702 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:46:59.929925 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:46:59.929795 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:47:01.929786 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:01.929611 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:47:01.930164 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:01.929614 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:47:01.930164 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:01.929858 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:47:01.930164 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:01.929614 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:47:01.930164 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:01.929935 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:47:01.930164 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:01.930017 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d" Feb 17 12:47:03.929104 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:03.929066 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:47:03.929510 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:03.929068 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:47:03.929510 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:03.929068 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:47:03.929510 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:03.929198 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kncvl" podUID="52127944-2f75-482d-bab6-3694ac75b66a" Feb 17 12:47:03.929510 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:03.929286 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:47:03.929510 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:03.929348 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6d7mm" podUID="ba4af195-0270-4e87-a0fe-8e7fdd18175d" Feb 17 12:47:05.149051 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.149024 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeReady" Feb 17 12:47:05.149449 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.149160 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Feb 17 12:47:05.189221 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.189189 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-h27xf"] Feb 17 12:47:05.219234 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.219204 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6q7rb"] Feb 17 12:47:05.219374 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.219349 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.222010 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.221979 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Feb 17 12:47:05.222010 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.221987 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Feb 17 12:47:05.222206 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.222042 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-944wh\"" Feb 17 12:47:05.236757 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.236686 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h27xf"] Feb 17 12:47:05.236855 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.236772 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6q7rb"] Feb 17 12:47:05.236855 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.236773 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:47:05.239165 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.239147 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9qj8f\"" Feb 17 12:47:05.239277 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.239154 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Feb 17 12:47:05.239343 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.239175 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Feb 17 12:47:05.239343 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.239209 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Feb 17 12:47:05.330734 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.330699 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fmg4\" (UniqueName: \"kubernetes.io/projected/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-kube-api-access-7fmg4\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:47:05.330906 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.330745 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d63493ac-401c-46c9-8e2d-344b22008d74-config-volume\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.330906 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.330773 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqx8h\" (UniqueName: \"kubernetes.io/projected/d63493ac-401c-46c9-8e2d-344b22008d74-kube-api-access-hqx8h\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.330906 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.330797 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:47:05.330906 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.330859 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d63493ac-401c-46c9-8e2d-344b22008d74-tmp-dir\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.330906 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.330903 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.432235 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.432162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fmg4\" (UniqueName: \"kubernetes.io/projected/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-kube-api-access-7fmg4\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:47:05.432235 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.432211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d63493ac-401c-46c9-8e2d-344b22008d74-config-volume\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.432235 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.432233 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqx8h\" (UniqueName: \"kubernetes.io/projected/d63493ac-401c-46c9-8e2d-344b22008d74-kube-api-access-hqx8h\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.432450 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.432249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:47:05.432450 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:05.432331 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Feb 17 12:47:05.432450 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:05.432385 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert podName:b0b91144-3ba6-4290-8174-1c2bdc3ca3d1 nodeName:}" failed. No retries permitted until 2026-02-17 12:47:05.932368717 +0000 UTC m=+49.527183519 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert") pod "ingress-canary-6q7rb" (UID: "b0b91144-3ba6-4290-8174-1c2bdc3ca3d1") : secret "canary-serving-cert" not found Feb 17 12:47:05.432450 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.432378 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d63493ac-401c-46c9-8e2d-344b22008d74-tmp-dir\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.432640 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.432451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.432640 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:05.432561 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Feb 17 12:47:05.432640 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:05.432619 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls podName:d63493ac-401c-46c9-8e2d-344b22008d74 nodeName:}" failed. No retries permitted until 2026-02-17 12:47:05.932601491 +0000 UTC m=+49.527416297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls") pod "dns-default-h27xf" (UID: "d63493ac-401c-46c9-8e2d-344b22008d74") : secret "dns-default-metrics-tls" not found Feb 17 12:47:05.444499 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.444471 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fmg4\" (UniqueName: \"kubernetes.io/projected/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-kube-api-access-7fmg4\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:47:05.447620 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.447574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d63493ac-401c-46c9-8e2d-344b22008d74-tmp-dir\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.447777 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.447760 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d63493ac-401c-46c9-8e2d-344b22008d74-config-volume\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.449477 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.449457 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqx8h\" (UniqueName: \"kubernetes.io/projected/d63493ac-401c-46c9-8e2d-344b22008d74-kube-api-access-hqx8h\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.928957 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.928924 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:47:05.929134 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.928924 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:47:05.929134 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.928924 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:47:05.931981 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.931962 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Feb 17 12:47:05.933263 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.933240 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Feb 17 12:47:05.933386 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.933239 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Feb 17 12:47:05.933386 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.933245 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-b7dtp\"" Feb 17 12:47:05.933386 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.933368 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cgwh6\"" Feb 17 12:47:05.933559 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.933478 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Feb 17 12:47:05.935748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.935719 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:47:05.935831 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:05.935793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:05.935887 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:05.935868 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Feb 17 12:47:05.935938 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:05.935925 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert podName:b0b91144-3ba6-4290-8174-1c2bdc3ca3d1 nodeName:}" failed. No retries permitted until 2026-02-17 12:47:06.935909394 +0000 UTC m=+50.530724220 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert") pod "ingress-canary-6q7rb" (UID: "b0b91144-3ba6-4290-8174-1c2bdc3ca3d1") : secret "canary-serving-cert" not found Feb 17 12:47:05.935938 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:05.935929 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Feb 17 12:47:05.936036 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:05.935980 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls podName:d63493ac-401c-46c9-8e2d-344b22008d74 nodeName:}" failed. No retries permitted until 2026-02-17 12:47:06.935963485 +0000 UTC m=+50.530778308 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls") pod "dns-default-h27xf" (UID: "d63493ac-401c-46c9-8e2d-344b22008d74") : secret "dns-default-metrics-tls" not found Feb 17 12:47:06.238408 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:06.238332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:47:06.240627 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:06.240609 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ba4af195-0270-4e87-a0fe-8e7fdd18175d-original-pull-secret\") pod \"global-pull-secret-syncer-6d7mm\" (UID: \"ba4af195-0270-4e87-a0fe-8e7fdd18175d\") " pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:47:06.539227 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:06.539157 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6d7mm" Feb 17 12:47:06.704706 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:06.704528 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6d7mm"] Feb 17 12:47:06.708215 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:47:06.708178 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba4af195_0270_4e87_a0fe_8e7fdd18175d.slice/crio-960d3e7d6ab969dff1428c5ccefecb13f20ec73431c6c3c749c2ed91a3822b5e WatchSource:0}: Error finding container 960d3e7d6ab969dff1428c5ccefecb13f20ec73431c6c3c749c2ed91a3822b5e: Status 404 returned error can't find the container with id 960d3e7d6ab969dff1428c5ccefecb13f20ec73431c6c3c749c2ed91a3822b5e Feb 17 12:47:06.944311 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:06.944279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:06.944489 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:06.944345 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:47:06.944489 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:06.944410 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Feb 17 12:47:06.944489 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:06.944433 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Feb 17 12:47:06.944489 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:06.944473 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls podName:d63493ac-401c-46c9-8e2d-344b22008d74 nodeName:}" failed. No retries permitted until 2026-02-17 12:47:08.944457875 +0000 UTC m=+52.539272718 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls") pod "dns-default-h27xf" (UID: "d63493ac-401c-46c9-8e2d-344b22008d74") : secret "dns-default-metrics-tls" not found Feb 17 12:47:06.944489 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:06.944487 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert podName:b0b91144-3ba6-4290-8174-1c2bdc3ca3d1 nodeName:}" failed. No retries permitted until 2026-02-17 12:47:08.944481352 +0000 UTC m=+52.539296154 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert") pod "ingress-canary-6q7rb" (UID: "b0b91144-3ba6-4290-8174-1c2bdc3ca3d1") : secret "canary-serving-cert" not found Feb 17 12:47:07.218374 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:07.218290 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6d7mm" event={"ID":"ba4af195-0270-4e87-a0fe-8e7fdd18175d","Type":"ContainerStarted","Data":"960d3e7d6ab969dff1428c5ccefecb13f20ec73431c6c3c749c2ed91a3822b5e"} Feb 17 12:47:08.955545 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:08.955512 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:47:08.956136 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:08.955580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:08.956136 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:08.955676 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Feb 17 12:47:08.956136 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:08.955694 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Feb 17 12:47:08.956136 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:08.955744 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert podName:b0b91144-3ba6-4290-8174-1c2bdc3ca3d1 nodeName:}" failed. No retries permitted until 2026-02-17 12:47:12.95572762 +0000 UTC m=+56.550542438 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert") pod "ingress-canary-6q7rb" (UID: "b0b91144-3ba6-4290-8174-1c2bdc3ca3d1") : secret "canary-serving-cert" not found Feb 17 12:47:08.956136 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:08.955762 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls podName:d63493ac-401c-46c9-8e2d-344b22008d74 nodeName:}" failed. No retries permitted until 2026-02-17 12:47:12.95575333 +0000 UTC m=+56.550568147 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls") pod "dns-default-h27xf" (UID: "d63493ac-401c-46c9-8e2d-344b22008d74") : secret "dns-default-metrics-tls" not found Feb 17 12:47:11.228832 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:11.228795 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6d7mm" event={"ID":"ba4af195-0270-4e87-a0fe-8e7fdd18175d","Type":"ContainerStarted","Data":"2775d5e9e5fdd14c6172755e0e79f656045551f8a86554035a721caeace1ca7a"} Feb 17 12:47:11.243802 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:11.243753 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6d7mm" podStartSLOduration=33.23527775 podStartE2EDuration="37.243736809s" podCreationTimestamp="2026-02-17 12:46:34 +0000 UTC" firstStartedPulling="2026-02-17 12:47:06.709981776 +0000 UTC m=+50.304796583" lastFinishedPulling="2026-02-17 12:47:10.718440831 +0000 UTC m=+54.313255642" observedRunningTime="2026-02-17 12:47:11.243489483 +0000 UTC m=+54.838304309" watchObservedRunningTime="2026-02-17 12:47:11.243736809 +0000 UTC m=+54.838551637" Feb 17 12:47:12.983976 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:12.983937 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:47:12.984402 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:12.983996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:12.984402 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:12.984082 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Feb 17 12:47:12.984402 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:12.984088 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Feb 17 12:47:12.984402 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:12.984150 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls podName:d63493ac-401c-46c9-8e2d-344b22008d74 nodeName:}" failed. No retries permitted until 2026-02-17 12:47:20.984133344 +0000 UTC m=+64.578948161 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls") pod "dns-default-h27xf" (UID: "d63493ac-401c-46c9-8e2d-344b22008d74") : secret "dns-default-metrics-tls" not found Feb 17 12:47:12.984402 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:12.984164 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert podName:b0b91144-3ba6-4290-8174-1c2bdc3ca3d1 nodeName:}" failed. No retries permitted until 2026-02-17 12:47:20.984158089 +0000 UTC m=+64.578972892 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert") pod "ingress-canary-6q7rb" (UID: "b0b91144-3ba6-4290-8174-1c2bdc3ca3d1") : secret "canary-serving-cert" not found Feb 17 12:47:16.431998 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:16.431964 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-494bm" Feb 17 12:47:21.041344 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:21.041297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:47:21.041733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:21.041356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:21.041733 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:21.041449 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Feb 17 12:47:21.041733 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:21.041453 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Feb 17 12:47:21.041733 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:21.041501 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls podName:d63493ac-401c-46c9-8e2d-344b22008d74 nodeName:}" failed. No retries permitted until 2026-02-17 12:47:37.041486933 +0000 UTC m=+80.636301735 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls") pod "dns-default-h27xf" (UID: "d63493ac-401c-46c9-8e2d-344b22008d74") : secret "dns-default-metrics-tls" not found Feb 17 12:47:21.041733 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:21.041515 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert podName:b0b91144-3ba6-4290-8174-1c2bdc3ca3d1 nodeName:}" failed. No retries permitted until 2026-02-17 12:47:37.041509093 +0000 UTC m=+80.636323896 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert") pod "ingress-canary-6q7rb" (UID: "b0b91144-3ba6-4290-8174-1c2bdc3ca3d1") : secret "canary-serving-cert" not found Feb 17 12:47:22.652993 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:22.652944 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvxl\" (UniqueName: \"kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl\") pod \"network-check-target-kncvl\" (UID: \"52127944-2f75-482d-bab6-3694ac75b66a\") " pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:47:22.656379 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:22.656357 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Feb 17 12:47:22.668586 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:22.668568 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Feb 17 12:47:22.676470 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:22.676443 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cvxl\" (UniqueName: \"kubernetes.io/projected/52127944-2f75-482d-bab6-3694ac75b66a-kube-api-access-4cvxl\") pod \"network-check-target-kncvl\" (UID: \"52127944-2f75-482d-bab6-3694ac75b66a\") " pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:47:22.753889 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:22.753854 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:47:22.755022 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:22.755006 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cgwh6\"" Feb 17 12:47:22.756780 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:22.756129 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:47:22.756780 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:22.756549 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Feb 17 12:47:22.764126 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:22.764082 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Feb 17 12:47:22.764223 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:22.764182 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs podName:ad710990-167a-49aa-bad8-faa970a4c3bb nodeName:}" failed. No retries permitted until 2026-02-17 12:48:26.764160514 +0000 UTC m=+130.358975336 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs") pod "network-metrics-daemon-cnhns" (UID: "ad710990-167a-49aa-bad8-faa970a4c3bb") : secret "metrics-daemon-secret" not found Feb 17 12:47:22.892446 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:22.892410 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kncvl"] Feb 17 12:47:22.895422 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:47:22.895387 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52127944_2f75_482d_bab6_3694ac75b66a.slice/crio-1f7089013137e314014f66409351bda4f5d8a12856a69052cc3bc8dc2a150588 WatchSource:0}: Error finding container 1f7089013137e314014f66409351bda4f5d8a12856a69052cc3bc8dc2a150588: Status 404 returned error can't find the container with id 1f7089013137e314014f66409351bda4f5d8a12856a69052cc3bc8dc2a150588 Feb 17 12:47:23.251909 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:23.251868 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kncvl" event={"ID":"52127944-2f75-482d-bab6-3694ac75b66a","Type":"ContainerStarted","Data":"1f7089013137e314014f66409351bda4f5d8a12856a69052cc3bc8dc2a150588"} Feb 17 12:47:26.259541 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:26.259507 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kncvl" event={"ID":"52127944-2f75-482d-bab6-3694ac75b66a","Type":"ContainerStarted","Data":"fa3d75546508096a9bee1388754dd1425cf8194881d981760cfeec3013d4727d"} Feb 17 12:47:26.259923 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:26.259794 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:47:26.275233 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:26.275191 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kncvl" podStartSLOduration=66.346826758 podStartE2EDuration="1m9.275179488s" podCreationTimestamp="2026-02-17 12:46:17 +0000 UTC" firstStartedPulling="2026-02-17 12:47:22.89724107 +0000 UTC m=+66.492055887" lastFinishedPulling="2026-02-17 12:47:25.82559381 +0000 UTC m=+69.420408617" observedRunningTime="2026-02-17 12:47:26.274474006 +0000 UTC m=+69.869288830" watchObservedRunningTime="2026-02-17 12:47:26.275179488 +0000 UTC m=+69.869994313" Feb 17 12:47:37.046956 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:37.046807 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:47:37.046956 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:37.046869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:47:37.046956 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:37.046936 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Feb 17 12:47:37.047721 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:37.046998 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert podName:b0b91144-3ba6-4290-8174-1c2bdc3ca3d1 nodeName:}" failed. No retries permitted until 2026-02-17 12:48:09.046983221 +0000 UTC m=+112.641798028 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert") pod "ingress-canary-6q7rb" (UID: "b0b91144-3ba6-4290-8174-1c2bdc3ca3d1") : secret "canary-serving-cert" not found Feb 17 12:47:37.047721 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:37.046941 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Feb 17 12:47:37.047721 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:47:37.047044 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls podName:d63493ac-401c-46c9-8e2d-344b22008d74 nodeName:}" failed. No retries permitted until 2026-02-17 12:48:09.047032861 +0000 UTC m=+112.641847666 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls") pod "dns-default-h27xf" (UID: "d63493ac-401c-46c9-8e2d-344b22008d74") : secret "dns-default-metrics-tls" not found Feb 17 12:47:57.263996 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:47:57.263963 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kncvl" Feb 17 12:48:09.056486 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:09.056436 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:48:09.056953 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:09.056510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:48:09.056953 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:09.056590 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Feb 17 12:48:09.056953 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:09.056604 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Feb 17 12:48:09.056953 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:09.056672 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert podName:b0b91144-3ba6-4290-8174-1c2bdc3ca3d1 nodeName:}" failed. No retries permitted until 2026-02-17 12:49:13.056655815 +0000 UTC m=+176.651470624 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert") pod "ingress-canary-6q7rb" (UID: "b0b91144-3ba6-4290-8174-1c2bdc3ca3d1") : secret "canary-serving-cert" not found Feb 17 12:48:09.056953 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:09.056686 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls podName:d63493ac-401c-46c9-8e2d-344b22008d74 nodeName:}" failed. No retries permitted until 2026-02-17 12:49:13.056681097 +0000 UTC m=+176.651495900 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls") pod "dns-default-h27xf" (UID: "d63493ac-401c-46c9-8e2d-344b22008d74") : secret "dns-default-metrics-tls" not found Feb 17 12:48:26.776573 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:26.776534 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:48:26.777010 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:26.776681 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Feb 17 12:48:26.777010 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:26.776755 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs podName:ad710990-167a-49aa-bad8-faa970a4c3bb nodeName:}" failed. No retries permitted until 2026-02-17 12:50:28.776735308 +0000 UTC m=+252.371550128 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs") pod "network-metrics-daemon-cnhns" (UID: "ad710990-167a-49aa-bad8-faa970a4c3bb") : secret "metrics-daemon-secret" not found Feb 17 12:48:50.994008 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:50.993977 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-56b878674-sddpm"] Feb 17 12:48:50.996777 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:50.996760 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-56b878674-sddpm" Feb 17 12:48:50.999246 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:50.999225 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Feb 17 12:48:50.999467 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:50.999453 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Feb 17 12:48:51.000669 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.000602 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-gqbv5\"" Feb 17 12:48:51.002027 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.002008 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5d56856ff5-ctf9v"] Feb 17 12:48:51.004876 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.004857 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-56b878674-sddpm"] Feb 17 12:48:51.004986 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.004972 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.007488 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.007468 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Feb 17 12:48:51.007580 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.007515 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-5pnvw\"" Feb 17 12:48:51.007647 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.007588 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Feb 17 12:48:51.007647 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.007591 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Feb 17 12:48:51.008105 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.008088 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Feb 17 12:48:51.013019 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.013001 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Feb 17 12:48:51.014939 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.014916 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5d56856ff5-ctf9v"] Feb 17 12:48:51.093184 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.093153 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h"] Feb 17 12:48:51.095947 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.095930 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" Feb 17 12:48:51.098920 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.098885 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Feb 17 12:48:51.098920 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.098957 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Feb 17 12:48:51.098920 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.099025 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-75pc2\"" Feb 17 12:48:51.098920 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.099078 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Feb 17 12:48:51.098920 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.099082 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Feb 17 12:48:51.100366 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.100343 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-5744d8689c-4b6mv"] Feb 17 12:48:51.102890 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.102866 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-5f8c4fff5b-fk4w7"] Feb 17 12:48:51.103103 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.103020 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.105741 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.105719 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Feb 17 12:48:51.105741 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.105737 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5f8c4fff5b-fk4w7" Feb 17 12:48:51.105895 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.105760 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-pkwdb\"" Feb 17 12:48:51.105895 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.105730 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Feb 17 12:48:51.105895 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.105793 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Feb 17 12:48:51.105895 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.105772 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Feb 17 12:48:51.108483 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.108464 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-97k6k\"" Feb 17 12:48:51.111013 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.110982 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h"] Feb 17 12:48:51.113223 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.113204 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Feb 17 12:48:51.114733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.114714 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-5f8c4fff5b-fk4w7"] Feb 17 12:48:51.115767 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.115746 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-5744d8689c-4b6mv"] Feb 17 12:48:51.144587 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.144559 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a8cc667-aa21-4c52-810c-330a53bdcfd3-tmp\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.144779 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.144593 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a8cc667-aa21-4c52-810c-330a53bdcfd3-trusted-ca-bundle\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.144779 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.144613 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfkdb\" (UniqueName: \"kubernetes.io/projected/1a8cc667-aa21-4c52-810c-330a53bdcfd3-kube-api-access-zfkdb\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.144779 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.144766 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a8cc667-aa21-4c52-810c-330a53bdcfd3-service-ca-bundle\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.144950 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.144803 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6qpq\" (UniqueName: \"kubernetes.io/projected/a75194e0-0c8c-4b2e-9d40-9622476fe327-kube-api-access-x6qpq\") pod \"volume-data-source-validator-56b878674-sddpm\" (UID: \"a75194e0-0c8c-4b2e-9d40-9622476fe327\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-56b878674-sddpm" Feb 17 12:48:51.144950 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.144849 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1a8cc667-aa21-4c52-810c-330a53bdcfd3-snapshots\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.144950 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.144879 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8cc667-aa21-4c52-810c-330a53bdcfd3-serving-cert\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.199942 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.199914 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs"] Feb 17 12:48:51.202771 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.202755 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs"] Feb 17 12:48:51.202929 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.202908 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" Feb 17 12:48:51.205482 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.205460 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Feb 17 12:48:51.205595 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.205504 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Feb 17 12:48:51.205652 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.205604 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-xsz85\"" Feb 17 12:48:51.205652 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.205635 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:51.207350 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.207323 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Feb 17 12:48:51.207446 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.207381 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Feb 17 12:48:51.209795 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.209776 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4zr7p\"" Feb 17 12:48:51.210418 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.210396 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Feb 17 12:48:51.210519 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.210453 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Feb 17 12:48:51.210590 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.210580 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Feb 17 12:48:51.210806 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.210695 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Feb 17 12:48:51.216911 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.216807 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs"] Feb 17 12:48:51.217843 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.217822 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs"] Feb 17 12:48:51.245335 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a8cc667-aa21-4c52-810c-330a53bdcfd3-tmp\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.245335 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245308 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d6500e-0397-48cd-bf45-464b40e47782-config\") pod \"service-ca-operator-ffd9f846b-scl5h\" (UID: \"c2d6500e-0397-48cd-bf45-464b40e47782\") " pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" Feb 17 12:48:51.245335 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245330 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a8cc667-aa21-4c52-810c-330a53bdcfd3-trusted-ca-bundle\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.245559 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245356 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zfkdb\" (UniqueName: \"kubernetes.io/projected/1a8cc667-aa21-4c52-810c-330a53bdcfd3-kube-api-access-zfkdb\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.245559 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245405 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2876k\" (UniqueName: \"kubernetes.io/projected/af5d6011-8448-486c-8483-99cdd3870524-kube-api-access-2876k\") pod \"network-check-source-5f8c4fff5b-fk4w7\" (UID: \"af5d6011-8448-486c-8483-99cdd3870524\") " pod="openshift-network-diagnostics/network-check-source-5f8c4fff5b-fk4w7" Feb 17 12:48:51.245559 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245430 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276ac3fc-41f7-4f46-8cd1-e26a91986d96-config\") pod \"console-operator-5744d8689c-4b6mv\" (UID: \"276ac3fc-41f7-4f46-8cd1-e26a91986d96\") " pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.245712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245597 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a8cc667-aa21-4c52-810c-330a53bdcfd3-service-ca-bundle\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.245712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6qpq\" (UniqueName: \"kubernetes.io/projected/a75194e0-0c8c-4b2e-9d40-9622476fe327-kube-api-access-x6qpq\") pod \"volume-data-source-validator-56b878674-sddpm\" (UID: \"a75194e0-0c8c-4b2e-9d40-9622476fe327\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-56b878674-sddpm" Feb 17 12:48:51.245712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245636 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a8cc667-aa21-4c52-810c-330a53bdcfd3-tmp\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.245712 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/276ac3fc-41f7-4f46-8cd1-e26a91986d96-serving-cert\") pod \"console-operator-5744d8689c-4b6mv\" (UID: \"276ac3fc-41f7-4f46-8cd1-e26a91986d96\") " pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.245898 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245755 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-992cn\" (UniqueName: \"kubernetes.io/projected/276ac3fc-41f7-4f46-8cd1-e26a91986d96-kube-api-access-992cn\") pod \"console-operator-5744d8689c-4b6mv\" (UID: \"276ac3fc-41f7-4f46-8cd1-e26a91986d96\") " pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.245898 ip-10-0-131-216 kubenswrapper[2573]: I0217 
12:48:51.245773 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d6500e-0397-48cd-bf45-464b40e47782-serving-cert\") pod \"service-ca-operator-ffd9f846b-scl5h\" (UID: \"c2d6500e-0397-48cd-bf45-464b40e47782\") " pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" Feb 17 12:48:51.245898 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245796 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1a8cc667-aa21-4c52-810c-330a53bdcfd3-snapshots\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.245898 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245816 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8cc667-aa21-4c52-810c-330a53bdcfd3-serving-cert\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.245898 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245838 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/276ac3fc-41f7-4f46-8cd1-e26a91986d96-trusted-ca\") pod \"console-operator-5744d8689c-4b6mv\" (UID: \"276ac3fc-41f7-4f46-8cd1-e26a91986d96\") " pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.245898 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.245853 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmmqq\" (UniqueName: \"kubernetes.io/projected/c2d6500e-0397-48cd-bf45-464b40e47782-kube-api-access-zmmqq\") pod \"service-ca-operator-ffd9f846b-scl5h\" (UID: \"c2d6500e-0397-48cd-bf45-464b40e47782\") " pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" Feb 17 12:48:51.246203 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.246079 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a8cc667-aa21-4c52-810c-330a53bdcfd3-service-ca-bundle\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.246268 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.246257 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1a8cc667-aa21-4c52-810c-330a53bdcfd3-snapshots\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.246398 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.246378 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a8cc667-aa21-4c52-810c-330a53bdcfd3-trusted-ca-bundle\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.248027 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.248010 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8cc667-aa21-4c52-810c-330a53bdcfd3-serving-cert\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.253040 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.253020 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6qpq\" (UniqueName: \"kubernetes.io/projected/a75194e0-0c8c-4b2e-9d40-9622476fe327-kube-api-access-x6qpq\") pod \"volume-data-source-validator-56b878674-sddpm\" (UID: \"a75194e0-0c8c-4b2e-9d40-9622476fe327\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-56b878674-sddpm" Feb 17 12:48:51.253140 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.253084 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfkdb\" (UniqueName: \"kubernetes.io/projected/1a8cc667-aa21-4c52-810c-330a53bdcfd3-kube-api-access-zfkdb\") pod \"insights-operator-5d56856ff5-ctf9v\" (UID: \"1a8cc667-aa21-4c52-810c-330a53bdcfd3\") " pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.307024 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.306991 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-56b878674-sddpm" Feb 17 12:48:51.314767 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.314743 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" Feb 17 12:48:51.346338 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.346299 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-992cn\" (UniqueName: \"kubernetes.io/projected/276ac3fc-41f7-4f46-8cd1-e26a91986d96-kube-api-access-992cn\") pod \"console-operator-5744d8689c-4b6mv\" (UID: \"276ac3fc-41f7-4f46-8cd1-e26a91986d96\") " pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.346496 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.346342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:51.346540 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.346517 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvmtp\" (UniqueName: \"kubernetes.io/projected/d1ead7d2-81f9-4afa-8d87-188a741e9848-kube-api-access-lvmtp\") pod \"kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs\" (UID: \"d1ead7d2-81f9-4afa-8d87-188a741e9848\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" Feb 17 12:48:51.346572 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.346558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/276ac3fc-41f7-4f46-8cd1-e26a91986d96-trusted-ca\") pod \"console-operator-5744d8689c-4b6mv\" (UID: \"276ac3fc-41f7-4f46-8cd1-e26a91986d96\") " 
pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.346604 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.346586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmmqq\" (UniqueName: \"kubernetes.io/projected/c2d6500e-0397-48cd-bf45-464b40e47782-kube-api-access-zmmqq\") pod \"service-ca-operator-ffd9f846b-scl5h\" (UID: \"c2d6500e-0397-48cd-bf45-464b40e47782\") " pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" Feb 17 12:48:51.346634 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.346615 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a8f510bc-d548-48d5-88df-9aad16d1fee4-telemetry-config\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:51.346666 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.346651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2876k\" (UniqueName: \"kubernetes.io/projected/af5d6011-8448-486c-8483-99cdd3870524-kube-api-access-2876k\") pod \"network-check-source-5f8c4fff5b-fk4w7\" (UID: \"af5d6011-8448-486c-8483-99cdd3870524\") " pod="openshift-network-diagnostics/network-check-source-5f8c4fff5b-fk4w7" Feb 17 12:48:51.347221 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.347196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276ac3fc-41f7-4f46-8cd1-e26a91986d96-config\") pod \"console-operator-5744d8689c-4b6mv\" (UID: \"276ac3fc-41f7-4f46-8cd1-e26a91986d96\") " pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.347296 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.347280 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d6500e-0397-48cd-bf45-464b40e47782-config\") pod \"service-ca-operator-ffd9f846b-scl5h\" (UID: \"c2d6500e-0397-48cd-bf45-464b40e47782\") " pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" Feb 17 12:48:51.347341 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.347318 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/276ac3fc-41f7-4f46-8cd1-e26a91986d96-serving-cert\") pod \"console-operator-5744d8689c-4b6mv\" (UID: \"276ac3fc-41f7-4f46-8cd1-e26a91986d96\") " pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.347376 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.347349 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ead7d2-81f9-4afa-8d87-188a741e9848-serving-cert\") pod \"kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs\" (UID: \"d1ead7d2-81f9-4afa-8d87-188a741e9848\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" Feb 17 12:48:51.347755 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.347399 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d6500e-0397-48cd-bf45-464b40e47782-serving-cert\") pod 
\"service-ca-operator-ffd9f846b-scl5h\" (UID: \"c2d6500e-0397-48cd-bf45-464b40e47782\") " pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" Feb 17 12:48:51.347755 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.347438 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r59z\" (UniqueName: \"kubernetes.io/projected/a8f510bc-d548-48d5-88df-9aad16d1fee4-kube-api-access-5r59z\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:51.347755 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.347519 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ead7d2-81f9-4afa-8d87-188a741e9848-config\") pod \"kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs\" (UID: \"d1ead7d2-81f9-4afa-8d87-188a741e9848\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" Feb 17 12:48:51.347755 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.347655 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/276ac3fc-41f7-4f46-8cd1-e26a91986d96-trusted-ca\") pod \"console-operator-5744d8689c-4b6mv\" (UID: \"276ac3fc-41f7-4f46-8cd1-e26a91986d96\") " pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.348003 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.347807 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276ac3fc-41f7-4f46-8cd1-e26a91986d96-config\") pod \"console-operator-5744d8689c-4b6mv\" (UID: \"276ac3fc-41f7-4f46-8cd1-e26a91986d96\") " pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.348212 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.348159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d6500e-0397-48cd-bf45-464b40e47782-config\") pod \"service-ca-operator-ffd9f846b-scl5h\" (UID: \"c2d6500e-0397-48cd-bf45-464b40e47782\") " pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" Feb 17 12:48:51.349862 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.349817 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/276ac3fc-41f7-4f46-8cd1-e26a91986d96-serving-cert\") pod \"console-operator-5744d8689c-4b6mv\" (UID: \"276ac3fc-41f7-4f46-8cd1-e26a91986d96\") " pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.351131 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.351091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d6500e-0397-48cd-bf45-464b40e47782-serving-cert\") pod \"service-ca-operator-ffd9f846b-scl5h\" (UID: \"c2d6500e-0397-48cd-bf45-464b40e47782\") " pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" Feb 17 12:48:51.355814 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.355577 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2876k\" (UniqueName: \"kubernetes.io/projected/af5d6011-8448-486c-8483-99cdd3870524-kube-api-access-2876k\") pod 
\"network-check-source-5f8c4fff5b-fk4w7\" (UID: \"af5d6011-8448-486c-8483-99cdd3870524\") " pod="openshift-network-diagnostics/network-check-source-5f8c4fff5b-fk4w7" Feb 17 12:48:51.355814 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.355766 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmmqq\" (UniqueName: \"kubernetes.io/projected/c2d6500e-0397-48cd-bf45-464b40e47782-kube-api-access-zmmqq\") pod \"service-ca-operator-ffd9f846b-scl5h\" (UID: \"c2d6500e-0397-48cd-bf45-464b40e47782\") " pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" Feb 17 12:48:51.357036 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.357010 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-992cn\" (UniqueName: \"kubernetes.io/projected/276ac3fc-41f7-4f46-8cd1-e26a91986d96-kube-api-access-992cn\") pod \"console-operator-5744d8689c-4b6mv\" (UID: \"276ac3fc-41f7-4f46-8cd1-e26a91986d96\") " pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.405942 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.405911 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" Feb 17 12:48:51.413716 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.413681 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:48:51.420133 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.420012 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5f8c4fff5b-fk4w7" Feb 17 12:48:51.431083 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.431021 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-56b878674-sddpm"] Feb 17 12:48:51.435276 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:48:51.435228 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda75194e0_0c8c_4b2e_9d40_9622476fe327.slice/crio-c98018d16dd5d8de3aaa5b7e96e5eda00270ac580b5b2ea484adfe9acb4052b9 WatchSource:0}: Error finding container c98018d16dd5d8de3aaa5b7e96e5eda00270ac580b5b2ea484adfe9acb4052b9: Status 404 returned error can't find the container with id c98018d16dd5d8de3aaa5b7e96e5eda00270ac580b5b2ea484adfe9acb4052b9 Feb 17 12:48:51.448228 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.448189 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ead7d2-81f9-4afa-8d87-188a741e9848-config\") pod \"kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs\" (UID: \"d1ead7d2-81f9-4afa-8d87-188a741e9848\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" Feb 17 12:48:51.448405 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.448252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:51.448405 ip-10-0-131-216 kubenswrapper[2573]: 
I0217 12:48:51.448300 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvmtp\" (UniqueName: \"kubernetes.io/projected/d1ead7d2-81f9-4afa-8d87-188a741e9848-kube-api-access-lvmtp\") pod \"kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs\" (UID: \"d1ead7d2-81f9-4afa-8d87-188a741e9848\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" Feb 17 12:48:51.448405 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.448338 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a8f510bc-d548-48d5-88df-9aad16d1fee4-telemetry-config\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:51.448568 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.448419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ead7d2-81f9-4afa-8d87-188a741e9848-serving-cert\") pod \"kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs\" (UID: \"d1ead7d2-81f9-4afa-8d87-188a741e9848\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" Feb 17 12:48:51.448568 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.448464 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r59z\" (UniqueName: \"kubernetes.io/projected/a8f510bc-d548-48d5-88df-9aad16d1fee4-kube-api-access-5r59z\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:51.449314 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:51.448978 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Feb 17 12:48:51.449314 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:51.449063 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls podName:a8f510bc-d548-48d5-88df-9aad16d1fee4 nodeName:}" failed. No retries permitted until 2026-02-17 12:48:51.949042262 +0000 UTC m=+155.543857080 (durationBeforeRetry 500ms). 
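
The "durationBeforeRetry 500ms" in the operation record above (its Error: detail continues below) is the first step of the exponential back-off that the kubelet's nestedpendingoperations helper applies to failed volume operations: later records in this section show the wait doubling to 1s, 2s, and then 4s while the cluster-monitoring-operator-tls secret remains missing. A minimal Go sketch of that schedule; only the 500ms base and the doubling are taken from this log, the cap is an assumption:

package main

import (
	"fmt"
	"time"
)

func main() {
	// First delay matches "durationBeforeRetry 500ms" above; each further
	// failure doubles it (1s, 2s, 4s appear later in this capture).
	delay := 500 * time.Millisecond
	// Assumed upper bound: the kubelet bounds the delay rather than growing
	// it forever; this value is an assumption, not taken from this log.
	const maxDelay = 2*time.Minute + 2*time.Second
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("attempt %d: wait %v before retrying MountVolume.SetUp\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

The retries keep failing harmlessly until the secret is published; in the capture below the delay reaches 4s without a successful mount.
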
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-565c7d9656-rhxxs" (UID: "a8f510bc-d548-48d5-88df-9aad16d1fee4") : secret "cluster-monitoring-operator-tls" not found Feb 17 12:48:51.449593 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.449551 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5d56856ff5-ctf9v"] Feb 17 12:48:51.449777 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.449729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a8f510bc-d548-48d5-88df-9aad16d1fee4-telemetry-config\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:51.449911 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.449807 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ead7d2-81f9-4afa-8d87-188a741e9848-config\") pod \"kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs\" (UID: \"d1ead7d2-81f9-4afa-8d87-188a741e9848\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" Feb 17 12:48:51.451726 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.451702 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ead7d2-81f9-4afa-8d87-188a741e9848-serving-cert\") pod \"kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs\" (UID: \"d1ead7d2-81f9-4afa-8d87-188a741e9848\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" Feb 17 12:48:51.453102 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:48:51.453070 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a8cc667_aa21_4c52_810c_330a53bdcfd3.slice/crio-2c73dafbb522bc24a94eb374059f766c859725b13972301f62862f365fe3bb8f WatchSource:0}: Error finding container 2c73dafbb522bc24a94eb374059f766c859725b13972301f62862f365fe3bb8f: Status 404 returned error can't find the container with id 2c73dafbb522bc24a94eb374059f766c859725b13972301f62862f365fe3bb8f Feb 17 12:48:51.460509 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.460441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvmtp\" (UniqueName: \"kubernetes.io/projected/d1ead7d2-81f9-4afa-8d87-188a741e9848-kube-api-access-lvmtp\") pod \"kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs\" (UID: \"d1ead7d2-81f9-4afa-8d87-188a741e9848\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" Feb 17 12:48:51.462122 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.461623 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r59z\" (UniqueName: \"kubernetes.io/projected/a8f510bc-d548-48d5-88df-9aad16d1fee4-kube-api-access-5r59z\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:51.514774 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.514745 2573 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" Feb 17 12:48:51.546829 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.546787 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h"] Feb 17 12:48:51.551293 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:48:51.551266 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2d6500e_0397_48cd_bf45_464b40e47782.slice/crio-e5a9614086380f1d03ff583f9e228f3604e11667d54901c3a28c16bd120bee9a WatchSource:0}: Error finding container e5a9614086380f1d03ff583f9e228f3604e11667d54901c3a28c16bd120bee9a: Status 404 returned error can't find the container with id e5a9614086380f1d03ff583f9e228f3604e11667d54901c3a28c16bd120bee9a Feb 17 12:48:51.636259 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.636228 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs"] Feb 17 12:48:51.638911 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:48:51.638884 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1ead7d2_81f9_4afa_8d87_188a741e9848.slice/crio-30ec3c6ac1bd18670800f0aaa2bc23cd130299a7d70d043151187df4b33184aa WatchSource:0}: Error finding container 30ec3c6ac1bd18670800f0aaa2bc23cd130299a7d70d043151187df4b33184aa: Status 404 returned error can't find the container with id 30ec3c6ac1bd18670800f0aaa2bc23cd130299a7d70d043151187df4b33184aa Feb 17 12:48:51.767300 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.767095 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-5f8c4fff5b-fk4w7"] Feb 17 12:48:51.769651 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:48:51.769618 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf5d6011_8448_486c_8483_99cdd3870524.slice/crio-85cdf6d68efc00e24bfc210fdd2a4f2587c11f49baa063cd946abb9a5fc0acfb WatchSource:0}: Error finding container 85cdf6d68efc00e24bfc210fdd2a4f2587c11f49baa063cd946abb9a5fc0acfb: Status 404 returned error can't find the container with id 85cdf6d68efc00e24bfc210fdd2a4f2587c11f49baa063cd946abb9a5fc0acfb Feb 17 12:48:51.770151 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.770103 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-5744d8689c-4b6mv"] Feb 17 12:48:51.772452 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:48:51.772431 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod276ac3fc_41f7_4f46_8cd1_e26a91986d96.slice/crio-e13c428dea880e54a46f7345f006783d1c399f7b1a25802045f9a35466ec1889 WatchSource:0}: Error finding container e13c428dea880e54a46f7345f006783d1c399f7b1a25802045f9a35466ec1889: Status 404 returned error can't find the container with id e13c428dea880e54a46f7345f006783d1c399f7b1a25802045f9a35466ec1889 Feb 17 12:48:51.952737 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:51.952573 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:51.952920 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:51.952801 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Feb 17 12:48:51.952920 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:51.952866 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls podName:a8f510bc-d548-48d5-88df-9aad16d1fee4 nodeName:}" failed. No retries permitted until 2026-02-17 12:48:52.952845087 +0000 UTC m=+156.547659893 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-565c7d9656-rhxxs" (UID: "a8f510bc-d548-48d5-88df-9aad16d1fee4") : secret "cluster-monitoring-operator-tls" not found Feb 17 12:48:52.420843 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:52.420803 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" event={"ID":"276ac3fc-41f7-4f46-8cd1-e26a91986d96","Type":"ContainerStarted","Data":"e13c428dea880e54a46f7345f006783d1c399f7b1a25802045f9a35466ec1889"} Feb 17 12:48:52.423400 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:52.423372 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-56b878674-sddpm" event={"ID":"a75194e0-0c8c-4b2e-9d40-9622476fe327","Type":"ContainerStarted","Data":"c98018d16dd5d8de3aaa5b7e96e5eda00270ac580b5b2ea484adfe9acb4052b9"} Feb 17 12:48:52.427697 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:52.427659 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" event={"ID":"1a8cc667-aa21-4c52-810c-330a53bdcfd3","Type":"ContainerStarted","Data":"2c73dafbb522bc24a94eb374059f766c859725b13972301f62862f365fe3bb8f"} Feb 17 12:48:52.429854 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:52.429813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" event={"ID":"d1ead7d2-81f9-4afa-8d87-188a741e9848","Type":"ContainerStarted","Data":"30ec3c6ac1bd18670800f0aaa2bc23cd130299a7d70d043151187df4b33184aa"} Feb 17 12:48:52.432136 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:52.431769 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-5f8c4fff5b-fk4w7" event={"ID":"af5d6011-8448-486c-8483-99cdd3870524","Type":"ContainerStarted","Data":"8699a7d5309eabaf3e2eec9d0271f7d2ac1b745fcff008439298e4e77c000c4c"} Feb 17 12:48:52.432136 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:52.431798 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-5f8c4fff5b-fk4w7" event={"ID":"af5d6011-8448-486c-8483-99cdd3870524","Type":"ContainerStarted","Data":"85cdf6d68efc00e24bfc210fdd2a4f2587c11f49baa063cd946abb9a5fc0acfb"} Feb 17 12:48:52.434098 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:52.434048 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" event={"ID":"c2d6500e-0397-48cd-bf45-464b40e47782","Type":"ContainerStarted","Data":"e5a9614086380f1d03ff583f9e228f3604e11667d54901c3a28c16bd120bee9a"} Feb 17 12:48:52.448185 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:52.447980 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-5f8c4fff5b-fk4w7" podStartSLOduration=1.447963417 podStartE2EDuration="1.447963417s" podCreationTimestamp="2026-02-17 12:48:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 12:48:52.44642272 +0000 UTC m=+156.041237555" watchObservedRunningTime="2026-02-17 12:48:52.447963417 +0000 UTC m=+156.042778243" Feb 17 12:48:52.961882 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:52.961841 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:52.962075 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:52.962037 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Feb 17 12:48:52.962176 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:52.962102 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls podName:a8f510bc-d548-48d5-88df-9aad16d1fee4 nodeName:}" failed. No retries permitted until 2026-02-17 12:48:54.962082842 +0000 UTC m=+158.556897650 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-565c7d9656-rhxxs" (UID: "a8f510bc-d548-48d5-88df-9aad16d1fee4") : secret "cluster-monitoring-operator-tls" not found Feb 17 12:48:54.978508 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:54.978454 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:54.978949 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:54.978584 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Feb 17 12:48:54.978949 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:54.978647 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls podName:a8f510bc-d548-48d5-88df-9aad16d1fee4 nodeName:}" failed. No retries permitted until 2026-02-17 12:48:58.978625964 +0000 UTC m=+162.573440767 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-565c7d9656-rhxxs" (UID: "a8f510bc-d548-48d5-88df-9aad16d1fee4") : secret "cluster-monitoring-operator-tls" not found Feb 17 12:48:56.446766 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:56.446732 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" event={"ID":"d1ead7d2-81f9-4afa-8d87-188a741e9848","Type":"ContainerStarted","Data":"8651a9eb76b7ec94f4afe9350de475b815883e238811c1f75668994d0240ba25"} Feb 17 12:48:56.448099 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:56.448070 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" event={"ID":"c2d6500e-0397-48cd-bf45-464b40e47782","Type":"ContainerStarted","Data":"381643b9a63e2e1f088818effaea33dc673be79702dedab45168c62c80d4d7b0"} Feb 17 12:48:56.449544 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:56.449522 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/0.log" Feb 17 12:48:56.449702 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:56.449563 2573 generic.go:358] "Generic (PLEG): container finished" podID="276ac3fc-41f7-4f46-8cd1-e26a91986d96" containerID="1868efe893a937e1d7d9e04752f080fd6f8e8b82c9299031f2ebc8295c1c60c0" exitCode=255 Feb 17 12:48:56.449702 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:56.449627 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" event={"ID":"276ac3fc-41f7-4f46-8cd1-e26a91986d96","Type":"ContainerDied","Data":"1868efe893a937e1d7d9e04752f080fd6f8e8b82c9299031f2ebc8295c1c60c0"} Feb 17 12:48:56.449822 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:56.449799 2573 scope.go:117] "RemoveContainer" containerID="1868efe893a937e1d7d9e04752f080fd6f8e8b82c9299031f2ebc8295c1c60c0" Feb 17 12:48:56.451093 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:56.451063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-56b878674-sddpm" event={"ID":"a75194e0-0c8c-4b2e-9d40-9622476fe327","Type":"ContainerStarted","Data":"469afac5e92802241d59eb3cdd1b2616b973aec4b6da17ec18fb1f739890f744"} Feb 17 12:48:56.452311 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:56.452292 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" event={"ID":"1a8cc667-aa21-4c52-810c-330a53bdcfd3","Type":"ContainerStarted","Data":"00fb147f5075f57b88d511b62780747405e80fecd32542403d2f34debc46e582"} Feb 17 12:48:56.463557 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:56.463521 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" podStartSLOduration=1.732963916 podStartE2EDuration="5.463510361s" podCreationTimestamp="2026-02-17 12:48:51 +0000 UTC" firstStartedPulling="2026-02-17 12:48:51.640691403 +0000 UTC m=+155.235506210" lastFinishedPulling="2026-02-17 12:48:55.371237844 +0000 UTC m=+158.966052655" observedRunningTime="2026-02-17 12:48:56.462386355 +0000 UTC m=+160.057201201" 
watchObservedRunningTime="2026-02-17 12:48:56.463510361 +0000 UTC m=+160.058325218" Feb 17 12:48:56.490932 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:56.490882 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-56b878674-sddpm" podStartSLOduration=2.560456904 podStartE2EDuration="6.490864989s" podCreationTimestamp="2026-02-17 12:48:50 +0000 UTC" firstStartedPulling="2026-02-17 12:48:51.437746593 +0000 UTC m=+155.032561409" lastFinishedPulling="2026-02-17 12:48:55.368154677 +0000 UTC m=+158.962969494" observedRunningTime="2026-02-17 12:48:56.490721601 +0000 UTC m=+160.085536426" watchObservedRunningTime="2026-02-17 12:48:56.490864989 +0000 UTC m=+160.085679815" Feb 17 12:48:56.507246 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:56.507205 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" podStartSLOduration=2.594024277 podStartE2EDuration="6.507192142s" podCreationTimestamp="2026-02-17 12:48:50 +0000 UTC" firstStartedPulling="2026-02-17 12:48:51.455782342 +0000 UTC m=+155.050597366" lastFinishedPulling="2026-02-17 12:48:55.368950415 +0000 UTC m=+158.963765231" observedRunningTime="2026-02-17 12:48:56.506665473 +0000 UTC m=+160.101480297" watchObservedRunningTime="2026-02-17 12:48:56.507192142 +0000 UTC m=+160.102006967" Feb 17 12:48:56.523559 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:56.523509 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" podStartSLOduration=1.7083315030000001 podStartE2EDuration="5.523490328s" podCreationTimestamp="2026-02-17 12:48:51 +0000 UTC" firstStartedPulling="2026-02-17 12:48:51.553247772 +0000 UTC m=+155.148062575" lastFinishedPulling="2026-02-17 12:48:55.368406589 +0000 UTC m=+158.963221400" observedRunningTime="2026-02-17 12:48:56.522393659 +0000 UTC m=+160.117208486" watchObservedRunningTime="2026-02-17 12:48:56.523490328 +0000 UTC m=+160.118305155" Feb 17 12:48:57.456319 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:57.456292 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/1.log" Feb 17 12:48:57.456748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:57.456646 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/0.log" Feb 17 12:48:57.456748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:57.456683 2573 generic.go:358] "Generic (PLEG): container finished" podID="276ac3fc-41f7-4f46-8cd1-e26a91986d96" containerID="0f5b2f2af3c7228f09c19e0d05b7295c1f6c586f4dd89c34c4a8dbfd33544e31" exitCode=255 Feb 17 12:48:57.456848 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:57.456820 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" event={"ID":"276ac3fc-41f7-4f46-8cd1-e26a91986d96","Type":"ContainerDied","Data":"0f5b2f2af3c7228f09c19e0d05b7295c1f6c586f4dd89c34c4a8dbfd33544e31"} Feb 17 12:48:57.456926 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:57.456866 2573 scope.go:117] "RemoveContainer" containerID="1868efe893a937e1d7d9e04752f080fd6f8e8b82c9299031f2ebc8295c1c60c0" Feb 17 12:48:57.457498 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:57.457473 2573 
scope.go:117] "RemoveContainer" containerID="0f5b2f2af3c7228f09c19e0d05b7295c1f6c586f4dd89c34c4a8dbfd33544e31" Feb 17 12:48:57.457697 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:57.457672 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-5744d8689c-4b6mv_openshift-console-operator(276ac3fc-41f7-4f46-8cd1-e26a91986d96)\"" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" podUID="276ac3fc-41f7-4f46-8cd1-e26a91986d96" Feb 17 12:48:58.460715 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.460642 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/1.log" Feb 17 12:48:58.461288 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.461050 2573 scope.go:117] "RemoveContainer" containerID="0f5b2f2af3c7228f09c19e0d05b7295c1f6c586f4dd89c34c4a8dbfd33544e31" Feb 17 12:48:58.461369 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:58.461294 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-5744d8689c-4b6mv_openshift-console-operator(276ac3fc-41f7-4f46-8cd1-e26a91986d96)\"" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" podUID="276ac3fc-41f7-4f46-8cd1-e26a91986d96" Feb 17 12:48:58.573895 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.573863 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-8495d7c844-5jhjj"] Feb 17 12:48:58.574164 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.574148 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4jqbk_13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55/dns-node-resolver/0.log" Feb 17 12:48:58.576685 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.576668 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" Feb 17 12:48:58.579448 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.579426 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Feb 17 12:48:58.579448 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.579435 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Feb 17 12:48:58.579612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.579438 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Feb 17 12:48:58.579612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.579478 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Feb 17 12:48:58.579612 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.579557 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-jklq7\"" Feb 17 12:48:58.583472 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.583453 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-8495d7c844-5jhjj"] Feb 17 12:48:58.609042 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.609019 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ecd00037-caa6-488d-8d6d-2b228d11821f-signing-key\") pod \"service-ca-8495d7c844-5jhjj\" (UID: \"ecd00037-caa6-488d-8d6d-2b228d11821f\") " pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" Feb 17 12:48:58.609171 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.609061 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ecd00037-caa6-488d-8d6d-2b228d11821f-signing-cabundle\") pod \"service-ca-8495d7c844-5jhjj\" (UID: \"ecd00037-caa6-488d-8d6d-2b228d11821f\") " pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" Feb 17 12:48:58.609171 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.609131 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b65df\" (UniqueName: \"kubernetes.io/projected/ecd00037-caa6-488d-8d6d-2b228d11821f-kube-api-access-b65df\") pod \"service-ca-8495d7c844-5jhjj\" (UID: \"ecd00037-caa6-488d-8d6d-2b228d11821f\") " pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" Feb 17 12:48:58.710039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.710009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ecd00037-caa6-488d-8d6d-2b228d11821f-signing-key\") pod \"service-ca-8495d7c844-5jhjj\" (UID: \"ecd00037-caa6-488d-8d6d-2b228d11821f\") " pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" Feb 17 12:48:58.710209 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.710063 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ecd00037-caa6-488d-8d6d-2b228d11821f-signing-cabundle\") pod \"service-ca-8495d7c844-5jhjj\" (UID: \"ecd00037-caa6-488d-8d6d-2b228d11821f\") " pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" Feb 17 12:48:58.710209 ip-10-0-131-216 kubenswrapper[2573]: I0217 
12:48:58.710099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b65df\" (UniqueName: \"kubernetes.io/projected/ecd00037-caa6-488d-8d6d-2b228d11821f-kube-api-access-b65df\") pod \"service-ca-8495d7c844-5jhjj\" (UID: \"ecd00037-caa6-488d-8d6d-2b228d11821f\") " pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" Feb 17 12:48:58.710820 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.710760 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ecd00037-caa6-488d-8d6d-2b228d11821f-signing-cabundle\") pod \"service-ca-8495d7c844-5jhjj\" (UID: \"ecd00037-caa6-488d-8d6d-2b228d11821f\") " pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" Feb 17 12:48:58.712415 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.712393 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ecd00037-caa6-488d-8d6d-2b228d11821f-signing-key\") pod \"service-ca-8495d7c844-5jhjj\" (UID: \"ecd00037-caa6-488d-8d6d-2b228d11821f\") " pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" Feb 17 12:48:58.718191 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.718163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b65df\" (UniqueName: \"kubernetes.io/projected/ecd00037-caa6-488d-8d6d-2b228d11821f-kube-api-access-b65df\") pod \"service-ca-8495d7c844-5jhjj\" (UID: \"ecd00037-caa6-488d-8d6d-2b228d11821f\") " pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" Feb 17 12:48:58.886538 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.886497 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" Feb 17 12:48:58.999878 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:58.999804 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-8495d7c844-5jhjj"] Feb 17 12:48:59.002932 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:48:59.002909 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd00037_caa6_488d_8d6d_2b228d11821f.slice/crio-297f3f48f3b2491768cc1568298796f1a76fb7caaa01d1a8322092b9468d8e66 WatchSource:0}: Error finding container 297f3f48f3b2491768cc1568298796f1a76fb7caaa01d1a8322092b9468d8e66: Status 404 returned error can't find the container with id 297f3f48f3b2491768cc1568298796f1a76fb7caaa01d1a8322092b9468d8e66 Feb 17 12:48:59.011991 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:59.011973 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:48:59.012194 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:59.012174 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Feb 17 12:48:59.012277 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:48:59.012262 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls podName:a8f510bc-d548-48d5-88df-9aad16d1fee4 
nodeName:}" failed. No retries permitted until 2026-02-17 12:49:07.01224002 +0000 UTC m=+170.607054836 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-565c7d9656-rhxxs" (UID: "a8f510bc-d548-48d5-88df-9aad16d1fee4") : secret "cluster-monitoring-operator-tls" not found Feb 17 12:48:59.464623 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:59.464590 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" event={"ID":"ecd00037-caa6-488d-8d6d-2b228d11821f","Type":"ContainerStarted","Data":"be7c0bc68716b259867c311fa5d59c9635bc5c22fcabb9675ea403306f3f2b9e"} Feb 17 12:48:59.464980 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:59.464628 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" event={"ID":"ecd00037-caa6-488d-8d6d-2b228d11821f","Type":"ContainerStarted","Data":"297f3f48f3b2491768cc1568298796f1a76fb7caaa01d1a8322092b9468d8e66"} Feb 17 12:48:59.483757 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:59.483713 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-8495d7c844-5jhjj" podStartSLOduration=1.483695832 podStartE2EDuration="1.483695832s" podCreationTimestamp="2026-02-17 12:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 12:48:59.48368049 +0000 UTC m=+163.078495316" watchObservedRunningTime="2026-02-17 12:48:59.483695832 +0000 UTC m=+163.078510657" Feb 17 12:48:59.769448 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:48:59.769376 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mdbbf_8ee47699-3923-4434-9f20-86ebd9785b9f/node-ca/0.log" Feb 17 12:49:01.414522 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:01.414482 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:49:01.414908 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:01.414535 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:49:01.415009 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:01.414994 2573 scope.go:117] "RemoveContainer" containerID="0f5b2f2af3c7228f09c19e0d05b7295c1f6c586f4dd89c34c4a8dbfd33544e31" Feb 17 12:49:01.415256 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:01.415236 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-5744d8689c-4b6mv_openshift-console-operator(276ac3fc-41f7-4f46-8cd1-e26a91986d96)\"" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" podUID="276ac3fc-41f7-4f46-8cd1-e26a91986d96" Feb 17 12:49:07.074606 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:07.074562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " 
pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:49:07.074997 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:07.074700 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Feb 17 12:49:07.074997 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:07.074783 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls podName:a8f510bc-d548-48d5-88df-9aad16d1fee4 nodeName:}" failed. No retries permitted until 2026-02-17 12:49:23.074764409 +0000 UTC m=+186.669579212 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-565c7d9656-rhxxs" (UID: "a8f510bc-d548-48d5-88df-9aad16d1fee4") : secret "cluster-monitoring-operator-tls" not found Feb 17 12:49:08.228830 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:08.228788 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-h27xf" podUID="d63493ac-401c-46c9-8e2d-344b22008d74" Feb 17 12:49:08.244922 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:08.244891 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6q7rb" podUID="b0b91144-3ba6-4290-8174-1c2bdc3ca3d1" Feb 17 12:49:08.488653 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:08.488569 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:49:08.488794 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:08.488656 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-h27xf" Feb 17 12:49:08.949864 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:08.949815 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-cnhns" podUID="ad710990-167a-49aa-bad8-faa970a4c3bb" Feb 17 12:49:13.124036 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:13.123996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:49:13.124424 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:13.124051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:49:13.126308 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:13.126288 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d63493ac-401c-46c9-8e2d-344b22008d74-metrics-tls\") pod \"dns-default-h27xf\" (UID: \"d63493ac-401c-46c9-8e2d-344b22008d74\") " pod="openshift-dns/dns-default-h27xf" Feb 17 12:49:13.126462 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:13.126443 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b91144-3ba6-4290-8174-1c2bdc3ca3d1-cert\") pod \"ingress-canary-6q7rb\" (UID: \"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1\") " pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:49:13.293891 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:13.293859 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-944wh\"" Feb 17 12:49:13.293891 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:13.293859 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9qj8f\"" Feb 17 12:49:13.300172 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:13.300147 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6q7rb" Feb 17 12:49:13.300172 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:13.300167 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-h27xf" Feb 17 12:49:13.442662 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:13.442630 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h27xf"] Feb 17 12:49:13.445884 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:49:13.445857 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd63493ac_401c_46c9_8e2d_344b22008d74.slice/crio-25c96ea0fb82009cd512b477ab5a1a1d0fa63e1830aaa968862b160eac67bb40 WatchSource:0}: Error finding container 25c96ea0fb82009cd512b477ab5a1a1d0fa63e1830aaa968862b160eac67bb40: Status 404 returned error can't find the container with id 25c96ea0fb82009cd512b477ab5a1a1d0fa63e1830aaa968862b160eac67bb40 Feb 17 12:49:13.457061 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:13.457037 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6q7rb"] Feb 17 12:49:13.469605 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:49:13.469576 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0b91144_3ba6_4290_8174_1c2bdc3ca3d1.slice/crio-3546287f05c0b7c95c86a126c95b8c4a18804bc29a3a862abe8809377b68a5f6 WatchSource:0}: Error finding container 3546287f05c0b7c95c86a126c95b8c4a18804bc29a3a862abe8809377b68a5f6: Status 404 returned error can't find the container with id 3546287f05c0b7c95c86a126c95b8c4a18804bc29a3a862abe8809377b68a5f6 Feb 17 12:49:13.502349 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:13.502317 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h27xf" event={"ID":"d63493ac-401c-46c9-8e2d-344b22008d74","Type":"ContainerStarted","Data":"25c96ea0fb82009cd512b477ab5a1a1d0fa63e1830aaa968862b160eac67bb40"} Feb 17 12:49:13.503261 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:13.503240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6q7rb" event={"ID":"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1","Type":"ContainerStarted","Data":"3546287f05c0b7c95c86a126c95b8c4a18804bc29a3a862abe8809377b68a5f6"} Feb 17 12:49:15.929742 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:15.929712 2573 scope.go:117] "RemoveContainer" containerID="0f5b2f2af3c7228f09c19e0d05b7295c1f6c586f4dd89c34c4a8dbfd33544e31" Feb 17 12:49:16.515138 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:16.515083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h27xf" event={"ID":"d63493ac-401c-46c9-8e2d-344b22008d74","Type":"ContainerStarted","Data":"a7796ccd5bb702da691a98c57e41422372ba73b453f5b45c7bcea82dccee8f61"} Feb 17 12:49:16.515138 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:16.515139 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h27xf" event={"ID":"d63493ac-401c-46c9-8e2d-344b22008d74","Type":"ContainerStarted","Data":"fa7540b81418dbbfaffa495c0cf94017de8c8c49bbea8ec87c11ed9255ae576f"} Feb 17 12:49:16.515389 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:16.515232 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-h27xf" Feb 17 12:49:16.516363 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:16.516345 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/2.log" Feb 17 12:49:16.516691 
ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:16.516675 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/1.log" Feb 17 12:49:16.516778 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:16.516710 2573 generic.go:358] "Generic (PLEG): container finished" podID="276ac3fc-41f7-4f46-8cd1-e26a91986d96" containerID="e958cb0d42f464b45838c76cf07f395719f1bc803a4397047de55f9a977be5f3" exitCode=255 Feb 17 12:49:16.516837 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:16.516778 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" event={"ID":"276ac3fc-41f7-4f46-8cd1-e26a91986d96","Type":"ContainerDied","Data":"e958cb0d42f464b45838c76cf07f395719f1bc803a4397047de55f9a977be5f3"} Feb 17 12:49:16.516837 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:16.516813 2573 scope.go:117] "RemoveContainer" containerID="0f5b2f2af3c7228f09c19e0d05b7295c1f6c586f4dd89c34c4a8dbfd33544e31" Feb 17 12:49:16.517097 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:16.517079 2573 scope.go:117] "RemoveContainer" containerID="e958cb0d42f464b45838c76cf07f395719f1bc803a4397047de55f9a977be5f3" Feb 17 12:49:16.517311 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:16.517288 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-5744d8689c-4b6mv_openshift-console-operator(276ac3fc-41f7-4f46-8cd1-e26a91986d96)\"" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" podUID="276ac3fc-41f7-4f46-8cd1-e26a91986d96" Feb 17 12:49:16.518088 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:16.518064 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6q7rb" event={"ID":"b0b91144-3ba6-4290-8174-1c2bdc3ca3d1","Type":"ContainerStarted","Data":"6ccf96feb1467879b7188e967f03ccbb6b07f4cf152431631a29c429a510684e"} Feb 17 12:49:16.531989 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:16.531938 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-h27xf" podStartSLOduration=129.496040754 podStartE2EDuration="2m11.531925341s" podCreationTimestamp="2026-02-17 12:47:05 +0000 UTC" firstStartedPulling="2026-02-17 12:49:13.447890034 +0000 UTC m=+177.042704836" lastFinishedPulling="2026-02-17 12:49:15.483774606 +0000 UTC m=+179.078589423" observedRunningTime="2026-02-17 12:49:16.530722066 +0000 UTC m=+180.125536891" watchObservedRunningTime="2026-02-17 12:49:16.531925341 +0000 UTC m=+180.126740165" Feb 17 12:49:16.549776 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:16.549731 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6q7rb" podStartSLOduration=129.533240659 podStartE2EDuration="2m11.549717287s" podCreationTimestamp="2026-02-17 12:47:05 +0000 UTC" firstStartedPulling="2026-02-17 12:49:13.471421402 +0000 UTC m=+177.066236205" lastFinishedPulling="2026-02-17 12:49:15.487898021 +0000 UTC m=+179.082712833" observedRunningTime="2026-02-17 12:49:16.548533916 +0000 UTC m=+180.143348742" watchObservedRunningTime="2026-02-17 12:49:16.549717287 +0000 UTC m=+180.144532161" Feb 17 12:49:17.411086 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.411050 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-insights/insights-runtime-extractor-wwx6c"] Feb 17 12:49:17.415425 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.415404 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.418176 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.418157 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Feb 17 12:49:17.419506 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.419489 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Feb 17 12:49:17.419506 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.419501 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hv7tp\"" Feb 17 12:49:17.425841 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.425822 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wwx6c"] Feb 17 12:49:17.451015 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.450989 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-55d85f6897-jnlnq"] Feb 17 12:49:17.454262 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.454239 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.456989 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.456962 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Feb 17 12:49:17.456989 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.456961 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-68g2c\"" Feb 17 12:49:17.456989 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.456984 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Feb 17 12:49:17.457251 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.456977 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Feb 17 12:49:17.461823 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.461797 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Feb 17 12:49:17.465047 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.464993 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55d85f6897-jnlnq"] Feb 17 12:49:17.522934 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.522908 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/2.log" Feb 17 12:49:17.559153 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559099 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0950e3dd-d44c-43e4-a432-70d036ed1820-data-volume\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 
17 12:49:17.559344 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559166 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qvtf\" (UniqueName: \"kubernetes.io/projected/e37820d2-72f5-4937-b1bf-d5f9263bc97c-kube-api-access-6qvtf\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.559344 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559201 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0950e3dd-d44c-43e4-a432-70d036ed1820-crio-socket\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.559344 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559225 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e37820d2-72f5-4937-b1bf-d5f9263bc97c-image-registry-private-configuration\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.559344 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559301 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0950e3dd-d44c-43e4-a432-70d036ed1820-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.559511 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559356 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e37820d2-72f5-4937-b1bf-d5f9263bc97c-installation-pull-secrets\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.559511 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0950e3dd-d44c-43e4-a432-70d036ed1820-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.559511 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559434 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e37820d2-72f5-4937-b1bf-d5f9263bc97c-registry-tls\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.559645 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559524 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz4zx\" (UniqueName: \"kubernetes.io/projected/0950e3dd-d44c-43e4-a432-70d036ed1820-kube-api-access-dz4zx\") pod 
\"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.559645 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e37820d2-72f5-4937-b1bf-d5f9263bc97c-ca-trust-extracted\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.559645 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559585 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e37820d2-72f5-4937-b1bf-d5f9263bc97c-registry-certificates\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.559645 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559628 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e37820d2-72f5-4937-b1bf-d5f9263bc97c-bound-sa-token\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.559786 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.559650 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37820d2-72f5-4937-b1bf-d5f9263bc97c-trusted-ca\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.660030 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.659990 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dz4zx\" (UniqueName: \"kubernetes.io/projected/0950e3dd-d44c-43e4-a432-70d036ed1820-kube-api-access-dz4zx\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.660030 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e37820d2-72f5-4937-b1bf-d5f9263bc97c-ca-trust-extracted\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.660309 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660047 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e37820d2-72f5-4937-b1bf-d5f9263bc97c-registry-certificates\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.660309 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660082 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e37820d2-72f5-4937-b1bf-d5f9263bc97c-bound-sa-token\") pod 
\"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.660309 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660100 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37820d2-72f5-4937-b1bf-d5f9263bc97c-trusted-ca\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.660309 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0950e3dd-d44c-43e4-a432-70d036ed1820-data-volume\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.660309 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660158 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qvtf\" (UniqueName: \"kubernetes.io/projected/e37820d2-72f5-4937-b1bf-d5f9263bc97c-kube-api-access-6qvtf\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.660309 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0950e3dd-d44c-43e4-a432-70d036ed1820-crio-socket\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.660309 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e37820d2-72f5-4937-b1bf-d5f9263bc97c-image-registry-private-configuration\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.660309 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0950e3dd-d44c-43e4-a432-70d036ed1820-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.660309 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660300 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e37820d2-72f5-4937-b1bf-d5f9263bc97c-installation-pull-secrets\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.660752 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0950e3dd-d44c-43e4-a432-70d036ed1820-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wwx6c\" (UID: 
\"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.660752 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660418 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e37820d2-72f5-4937-b1bf-d5f9263bc97c-registry-tls\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.660752 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660468 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e37820d2-72f5-4937-b1bf-d5f9263bc97c-ca-trust-extracted\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.660752 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0950e3dd-d44c-43e4-a432-70d036ed1820-crio-socket\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.660752 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.660642 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0950e3dd-d44c-43e4-a432-70d036ed1820-data-volume\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.661219 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.661096 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0950e3dd-d44c-43e4-a432-70d036ed1820-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.661565 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.661535 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e37820d2-72f5-4937-b1bf-d5f9263bc97c-registry-certificates\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.661699 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.661673 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37820d2-72f5-4937-b1bf-d5f9263bc97c-trusted-ca\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.663156 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.663135 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e37820d2-72f5-4937-b1bf-d5f9263bc97c-installation-pull-secrets\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.663289 ip-10-0-131-216 kubenswrapper[2573]: 
I0217 12:49:17.663270 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e37820d2-72f5-4937-b1bf-d5f9263bc97c-registry-tls\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.663346 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.663289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0950e3dd-d44c-43e4-a432-70d036ed1820-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.663346 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.663327 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e37820d2-72f5-4937-b1bf-d5f9263bc97c-image-registry-private-configuration\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.669518 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.669452 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz4zx\" (UniqueName: \"kubernetes.io/projected/0950e3dd-d44c-43e4-a432-70d036ed1820-kube-api-access-dz4zx\") pod \"insights-runtime-extractor-wwx6c\" (UID: \"0950e3dd-d44c-43e4-a432-70d036ed1820\") " pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.669679 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.669657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e37820d2-72f5-4937-b1bf-d5f9263bc97c-bound-sa-token\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.669753 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.669678 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qvtf\" (UniqueName: \"kubernetes.io/projected/e37820d2-72f5-4937-b1bf-d5f9263bc97c-kube-api-access-6qvtf\") pod \"image-registry-55d85f6897-jnlnq\" (UID: \"e37820d2-72f5-4937-b1bf-d5f9263bc97c\") " pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.723733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.723706 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wwx6c" Feb 17 12:49:17.764072 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.764040 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:17.850880 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.850848 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wwx6c"] Feb 17 12:49:17.854581 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:49:17.854541 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0950e3dd_d44c_43e4_a432_70d036ed1820.slice/crio-e7dd1c553cd9e6d0a3fee3c6574cdfadcb66649f23ccdb3fec9ea68d4d26ef42 WatchSource:0}: Error finding container e7dd1c553cd9e6d0a3fee3c6574cdfadcb66649f23ccdb3fec9ea68d4d26ef42: Status 404 returned error can't find the container with id e7dd1c553cd9e6d0a3fee3c6574cdfadcb66649f23ccdb3fec9ea68d4d26ef42 Feb 17 12:49:17.892450 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:17.892426 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55d85f6897-jnlnq"] Feb 17 12:49:17.894821 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:49:17.894790 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37820d2_72f5_4937_b1bf_d5f9263bc97c.slice/crio-b0c7497968088f8901772ef0d12d229a0fa2d22f00dd7b4ac4b868e681e9cac3 WatchSource:0}: Error finding container b0c7497968088f8901772ef0d12d229a0fa2d22f00dd7b4ac4b868e681e9cac3: Status 404 returned error can't find the container with id b0c7497968088f8901772ef0d12d229a0fa2d22f00dd7b4ac4b868e681e9cac3 Feb 17 12:49:18.527527 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:18.527498 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wwx6c" event={"ID":"0950e3dd-d44c-43e4-a432-70d036ed1820","Type":"ContainerStarted","Data":"59041d2cb1533a00844c9feaf89af14f831c3fef32c6fb089d4ce242b734b6c0"} Feb 17 12:49:18.527839 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:18.527537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wwx6c" event={"ID":"0950e3dd-d44c-43e4-a432-70d036ed1820","Type":"ContainerStarted","Data":"e7dd1c553cd9e6d0a3fee3c6574cdfadcb66649f23ccdb3fec9ea68d4d26ef42"} Feb 17 12:49:18.528801 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:18.528765 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" event={"ID":"e37820d2-72f5-4937-b1bf-d5f9263bc97c","Type":"ContainerStarted","Data":"c2d5fe166d1f037518bb33ab5fc02790c0776b1e687f51cada2c804539d988f6"} Feb 17 12:49:18.528892 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:18.528808 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" event={"ID":"e37820d2-72f5-4937-b1bf-d5f9263bc97c","Type":"ContainerStarted","Data":"b0c7497968088f8901772ef0d12d229a0fa2d22f00dd7b4ac4b868e681e9cac3"} Feb 17 12:49:18.528946 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:18.528929 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:18.546653 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:18.546612 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" podStartSLOduration=1.546598924 podStartE2EDuration="1.546598924s" podCreationTimestamp="2026-02-17 12:49:17 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 12:49:18.546020315 +0000 UTC m=+182.140835139" watchObservedRunningTime="2026-02-17 12:49:18.546598924 +0000 UTC m=+182.141413746" Feb 17 12:49:19.534036 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:19.533971 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wwx6c" event={"ID":"0950e3dd-d44c-43e4-a432-70d036ed1820","Type":"ContainerStarted","Data":"1e239a16092e2a09309d051a01d284f01195ddfabb85b1c3f60f579db6b33ff5"} Feb 17 12:49:19.929340 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:19.929311 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:49:20.539069 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:20.539033 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wwx6c" event={"ID":"0950e3dd-d44c-43e4-a432-70d036ed1820","Type":"ContainerStarted","Data":"dff4ad941278f6a175adf8b1d5eb046a6dc60609e5cedf55b3a67a6355cc1fef"} Feb 17 12:49:20.559844 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:20.559803 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wwx6c" podStartSLOduration=1.633828212 podStartE2EDuration="3.559791223s" podCreationTimestamp="2026-02-17 12:49:17 +0000 UTC" firstStartedPulling="2026-02-17 12:49:17.913752377 +0000 UTC m=+181.508567195" lastFinishedPulling="2026-02-17 12:49:19.839715388 +0000 UTC m=+183.434530206" observedRunningTime="2026-02-17 12:49:20.55830804 +0000 UTC m=+184.153122866" watchObservedRunningTime="2026-02-17 12:49:20.559791223 +0000 UTC m=+184.154606047" Feb 17 12:49:21.414051 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:21.414015 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:49:21.414051 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:21.414053 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:49:21.414428 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:21.414414 2573 scope.go:117] "RemoveContainer" containerID="e958cb0d42f464b45838c76cf07f395719f1bc803a4397047de55f9a977be5f3" Feb 17 12:49:21.414599 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:21.414583 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-5744d8689c-4b6mv_openshift-console-operator(276ac3fc-41f7-4f46-8cd1-e26a91986d96)\"" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" podUID="276ac3fc-41f7-4f46-8cd1-e26a91986d96" Feb 17 12:49:23.104213 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:23.104150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:49:23.106524 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:23.106503 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8f510bc-d548-48d5-88df-9aad16d1fee4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-565c7d9656-rhxxs\" (UID: \"a8f510bc-d548-48d5-88df-9aad16d1fee4\") " pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:49:23.322777 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:23.322740 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4zr7p\"" Feb 17 12:49:23.330770 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:23.330748 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" Feb 17 12:49:23.442161 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:23.442127 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs"] Feb 17 12:49:23.445320 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:49:23.445292 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8f510bc_d548_48d5_88df_9aad16d1fee4.slice/crio-54bb8515e1f0c6defcb9eea6f464a47f9b9c69977bc660fe3c10296d1e5fccb5 WatchSource:0}: Error finding container 54bb8515e1f0c6defcb9eea6f464a47f9b9c69977bc660fe3c10296d1e5fccb5: Status 404 returned error can't find the container with id 54bb8515e1f0c6defcb9eea6f464a47f9b9c69977bc660fe3c10296d1e5fccb5 Feb 17 12:49:23.548928 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:23.548897 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" event={"ID":"a8f510bc-d548-48d5-88df-9aad16d1fee4","Type":"ContainerStarted","Data":"54bb8515e1f0c6defcb9eea6f464a47f9b9c69977bc660fe3c10296d1e5fccb5"} Feb 17 12:49:25.556191 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:25.556104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" event={"ID":"a8f510bc-d548-48d5-88df-9aad16d1fee4","Type":"ContainerStarted","Data":"26a4a0a68b13784f12f5edb4ce7908cae2727f845b435cab8135350c4ba41fcf"} Feb 17 12:49:25.575264 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:25.575205 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-565c7d9656-rhxxs" podStartSLOduration=33.044624437 podStartE2EDuration="34.575194646s" podCreationTimestamp="2026-02-17 12:48:51 +0000 UTC" firstStartedPulling="2026-02-17 12:49:23.44709734 +0000 UTC m=+187.041912146" lastFinishedPulling="2026-02-17 12:49:24.977667541 +0000 UTC m=+188.572482355" observedRunningTime="2026-02-17 12:49:25.574012344 +0000 UTC m=+189.168827168" watchObservedRunningTime="2026-02-17 12:49:25.575194646 +0000 UTC m=+189.170009471" Feb 17 12:49:26.525549 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:26.525520 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-h27xf" Feb 17 12:49:32.884005 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.883964 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"] Feb 17 12:49:32.887576 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.887556 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:32.890641 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.890486 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Feb 17 12:49:32.890641 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.890498 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Feb 17 12:49:32.890641 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.890561 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Feb 17 12:49:32.890641 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.890629 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-6mjfv\""
Feb 17 12:49:32.896809 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.896789 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"]
Feb 17 12:49:32.903131 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.903092 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"]
Feb 17 12:49:32.906662 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.906640 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:32.909196 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.909173 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-dngrr\""
Feb 17 12:49:32.909491 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.909470 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Feb 17 12:49:32.909844 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.909823 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Feb 17 12:49:32.910376 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.910356 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Feb 17 12:49:32.918948 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.918920 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zdptk"]
Feb 17 12:49:32.922966 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.922929 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"]
Feb 17 12:49:32.923150 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.923134 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:32.925950 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.925931 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Feb 17 12:49:32.926084 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.925946 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4ggjd\""
Feb 17 12:49:32.926264 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.925984 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Feb 17 12:49:32.926331 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.926014 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Feb 17 12:49:32.929995 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.929977 2573 scope.go:117] "RemoveContainer" containerID="e958cb0d42f464b45838c76cf07f395719f1bc803a4397047de55f9a977be5f3"
Feb 17 12:49:32.930237 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:32.930199 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-5744d8689c-4b6mv_openshift-console-operator(276ac3fc-41f7-4f46-8cd1-e26a91986d96)\"" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" podUID="276ac3fc-41f7-4f46-8cd1-e26a91986d96"
Feb 17 12:49:32.982488 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.982456 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-volume-directive-shadow\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:32.982488 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.982493 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:32.982716 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.982510 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n46rx\" (UniqueName: \"kubernetes.io/projected/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-api-access-n46rx\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:32.982716 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.982538 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea77fa6e-f053-4051-ab3d-c52ead601a19-openshift-state-metrics-tls\") pod \"openshift-state-metrics-6b69bf8d6b-74lsq\" (UID: \"ea77fa6e-f053-4051-ab3d-c52ead601a19\") " pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:32.982716 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.982589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:32.982716 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.982624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhp58\" (UniqueName: \"kubernetes.io/projected/ea77fa6e-f053-4051-ab3d-c52ead601a19-kube-api-access-jhp58\") pod \"openshift-state-metrics-6b69bf8d6b-74lsq\" (UID: \"ea77fa6e-f053-4051-ab3d-c52ead601a19\") " pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:32.982716 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.982669 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-metrics-client-ca\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:32.982716 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.982693 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea77fa6e-f053-4051-ab3d-c52ead601a19-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-6b69bf8d6b-74lsq\" (UID: \"ea77fa6e-f053-4051-ab3d-c52ead601a19\") " pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:32.982716 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.982709 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea77fa6e-f053-4051-ab3d-c52ead601a19-metrics-client-ca\") pod \"openshift-state-metrics-6b69bf8d6b-74lsq\" (UID: \"ea77fa6e-f053-4051-ab3d-c52ead601a19\") " pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:32.982998 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:32.982811 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-state-metrics-tls\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.083775 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.083737 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-accelerators-collector-config\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.083775 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.083782 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-tls\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.084052 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.083817 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-metrics-client-ca\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.084052 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.083875 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea77fa6e-f053-4051-ab3d-c52ead601a19-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-6b69bf8d6b-74lsq\" (UID: \"ea77fa6e-f053-4051-ab3d-c52ead601a19\") " pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:33.084052 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea77fa6e-f053-4051-ab3d-c52ead601a19-metrics-client-ca\") pod \"openshift-state-metrics-6b69bf8d6b-74lsq\" (UID: \"ea77fa6e-f053-4051-ab3d-c52ead601a19\") " pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:33.084420 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084387 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.084550 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084452 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5jl9\" (UniqueName: \"kubernetes.io/projected/aae180f1-f47e-481b-877d-af97cf7e7caa-kube-api-access-l5jl9\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.084550 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084504 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-state-metrics-tls\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.084657 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084575 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-wtmp\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.084657 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084613 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-volume-directive-shadow\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.084657 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-textfile\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.084793 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084678 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.084793 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084694 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-metrics-client-ca\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.084793 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084706 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n46rx\" (UniqueName: \"kubernetes.io/projected/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-api-access-n46rx\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.084793 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084733 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aae180f1-f47e-481b-877d-af97cf7e7caa-root\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.084793 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084731 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea77fa6e-f053-4051-ab3d-c52ead601a19-metrics-client-ca\") pod \"openshift-state-metrics-6b69bf8d6b-74lsq\" (UID: \"ea77fa6e-f053-4051-ab3d-c52ead601a19\") " pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:33.084793 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.084772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea77fa6e-f053-4051-ab3d-c52ead601a19-openshift-state-metrics-tls\") pod \"openshift-state-metrics-6b69bf8d6b-74lsq\" (UID: \"ea77fa6e-f053-4051-ab3d-c52ead601a19\") " pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:33.085080 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:33.084979 2573 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Feb 17 12:49:33.085080 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:33.085033 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-state-metrics-tls podName:39f8d1c9-c3cb-4a8a-a78d-a715c0f92754 nodeName:}" failed. No retries permitted until 2026-02-17 12:49:33.585017268 +0000 UTC m=+197.179832073 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-state-metrics-tls") pod "kube-state-metrics-77b75dc9f9-q67mp" (UID: "39f8d1c9-c3cb-4a8a-a78d-a715c0f92754") : secret "kube-state-metrics-tls" not found
Feb 17 12:49:33.085348 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.085324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.085485 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.085471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aae180f1-f47e-481b-877d-af97cf7e7caa-sys\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.085578 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.085404 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-volume-directive-shadow\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.085661 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.085646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aae180f1-f47e-481b-877d-af97cf7e7caa-metrics-client-ca\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.085780 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.085766 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhp58\" (UniqueName: \"kubernetes.io/projected/ea77fa6e-f053-4051-ab3d-c52ead601a19-kube-api-access-jhp58\") pod \"openshift-state-metrics-6b69bf8d6b-74lsq\" (UID: \"ea77fa6e-f053-4051-ab3d-c52ead601a19\") " pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:36.086220 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.085959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.087037 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.087017 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea77fa6e-f053-4051-ab3d-c52ead601a19-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-6b69bf8d6b-74lsq\" (UID: \"ea77fa6e-f053-4051-ab3d-c52ead601a19\") " pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:33.087815 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.087781 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea77fa6e-f053-4051-ab3d-c52ead601a19-openshift-state-metrics-tls\") pod \"openshift-state-metrics-6b69bf8d6b-74lsq\" (UID: \"ea77fa6e-f053-4051-ab3d-c52ead601a19\") " pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:33.087908 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.087872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.095939 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.095916 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n46rx\" (UniqueName: \"kubernetes.io/projected/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-api-access-n46rx\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.097119 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.097087 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhp58\" (UniqueName: \"kubernetes.io/projected/ea77fa6e-f053-4051-ab3d-c52ead601a19-kube-api-access-jhp58\") pod \"openshift-state-metrics-6b69bf8d6b-74lsq\" (UID: \"ea77fa6e-f053-4051-ab3d-c52ead601a19\") " pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:33.187157 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.187120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.187349 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.187183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5jl9\" (UniqueName: \"kubernetes.io/projected/aae180f1-f47e-481b-877d-af97cf7e7caa-kube-api-access-l5jl9\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.187349 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.187251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-wtmp\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.187349 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.187285 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-textfile\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.187349 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.187317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aae180f1-f47e-481b-877d-af97cf7e7caa-root\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.187555 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.187355 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aae180f1-f47e-481b-877d-af97cf7e7caa-sys\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.187555 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.187384 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aae180f1-f47e-481b-877d-af97cf7e7caa-metrics-client-ca\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.187555 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.187421 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-accelerators-collector-config\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.187555 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.187449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-tls\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.187741 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:33.187563 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Feb 17 12:49:33.187741 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.187606 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aae180f1-f47e-481b-877d-af97cf7e7caa-sys\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.187741 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:49:33.187621 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-tls podName:aae180f1-f47e-481b-877d-af97cf7e7caa nodeName:}" failed. No retries permitted until 2026-02-17 12:49:33.687603042 +0000 UTC m=+197.282417852 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-tls") pod "node-exporter-zdptk" (UID: "aae180f1-f47e-481b-877d-af97cf7e7caa") : secret "node-exporter-tls" not found
Feb 17 12:49:33.187741 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.187648 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-textfile\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.187741 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.187669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aae180f1-f47e-481b-877d-af97cf7e7caa-root\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.188177 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.188156 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aae180f1-f47e-481b-877d-af97cf7e7caa-metrics-client-ca\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.188288 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.188267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-accelerators-collector-config\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.188350 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.188320 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-wtmp\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.189649 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.189628 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.197594 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.197572 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"
Feb 17 12:49:33.340059 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.340032 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq"]
Feb 17 12:49:33.342345 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:49:33.342317 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea77fa6e_f053_4051_ab3d_c52ead601a19.slice/crio-9eda4256c985c8cdf9a678fdc482a8d9ef86c2aa2fb4d042670019ef6ba12c56 WatchSource:0}: Error finding container 9eda4256c985c8cdf9a678fdc482a8d9ef86c2aa2fb4d042670019ef6ba12c56: Status 404 returned error can't find the container with id 9eda4256c985c8cdf9a678fdc482a8d9ef86c2aa2fb4d042670019ef6ba12c56
Feb 17 12:49:33.397284 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.397258 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5jl9\" (UniqueName: \"kubernetes.io/projected/aae180f1-f47e-481b-877d-af97cf7e7caa-kube-api-access-l5jl9\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk"
Feb 17 12:49:33.577010 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.576975 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq" event={"ID":"ea77fa6e-f053-4051-ab3d-c52ead601a19","Type":"ContainerStarted","Data":"2da63d3a6f2fa5336a03dd8c6378bb6de707d39220aff9842f2b1e8f3a23a4e9"}
Feb 17 12:49:33.577010 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.577012 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq" event={"ID":"ea77fa6e-f053-4051-ab3d-c52ead601a19","Type":"ContainerStarted","Data":"c86cf29c9e3e134ca530665a471d41babc09a962fe6396c3e905ecc4888e964c"}
Feb 17 12:49:33.577223 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.577022 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq" event={"ID":"ea77fa6e-f053-4051-ab3d-c52ead601a19","Type":"ContainerStarted","Data":"9eda4256c985c8cdf9a678fdc482a8d9ef86c2aa2fb4d042670019ef6ba12c56"}
Feb 17 12:49:33.590524 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.590486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-state-metrics-tls\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
Feb 17 12:49:33.592964 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.592944 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/39f8d1c9-c3cb-4a8a-a78d-a715c0f92754-kube-state-metrics-tls\") pod \"kube-state-metrics-77b75dc9f9-q67mp\" (UID: \"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754\") " pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"
\"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk" Feb 17 12:49:33.694023 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.693998 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aae180f1-f47e-481b-877d-af97cf7e7caa-node-exporter-tls\") pod \"node-exporter-zdptk\" (UID: \"aae180f1-f47e-481b-877d-af97cf7e7caa\") " pod="openshift-monitoring/node-exporter-zdptk" Feb 17 12:49:33.817293 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.817204 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp" Feb 17 12:49:33.836165 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.836136 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zdptk" Feb 17 12:49:33.846529 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:49:33.846496 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae180f1_f47e_481b_877d_af97cf7e7caa.slice/crio-f306a112a18bde3108dcb2ae73ec74359125742eef731688cfde648eed8095be WatchSource:0}: Error finding container f306a112a18bde3108dcb2ae73ec74359125742eef731688cfde648eed8095be: Status 404 returned error can't find the container with id f306a112a18bde3108dcb2ae73ec74359125742eef731688cfde648eed8095be Feb 17 12:49:33.964327 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:33.964289 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp"] Feb 17 12:49:33.968302 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:49:33.968271 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39f8d1c9_c3cb_4a8a_a78d_a715c0f92754.slice/crio-0cfc61ccfbdc3369e04d319360068c5bf7fb21cf121df87ffa89f9cb32146d1a WatchSource:0}: Error finding container 0cfc61ccfbdc3369e04d319360068c5bf7fb21cf121df87ffa89f9cb32146d1a: Status 404 returned error can't find the container with id 0cfc61ccfbdc3369e04d319360068c5bf7fb21cf121df87ffa89f9cb32146d1a Feb 17 12:49:34.583533 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:34.583476 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq" event={"ID":"ea77fa6e-f053-4051-ab3d-c52ead601a19","Type":"ContainerStarted","Data":"267aed8df24b149492677a3b0593bedfb11424c4a7483b657af0bdf19ea21033"} Feb 17 12:49:34.586280 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:34.585495 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp" event={"ID":"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754","Type":"ContainerStarted","Data":"0cfc61ccfbdc3369e04d319360068c5bf7fb21cf121df87ffa89f9cb32146d1a"} Feb 17 12:49:34.587026 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:34.587000 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zdptk" event={"ID":"aae180f1-f47e-481b-877d-af97cf7e7caa","Type":"ContainerStarted","Data":"f306a112a18bde3108dcb2ae73ec74359125742eef731688cfde648eed8095be"} Feb 17 12:49:34.600964 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:34.600796 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-6b69bf8d6b-74lsq" podStartSLOduration=1.6720885380000001 
podStartE2EDuration="2.600776418s" podCreationTimestamp="2026-02-17 12:49:32 +0000 UTC" firstStartedPulling="2026-02-17 12:49:33.481407664 +0000 UTC m=+197.076222468" lastFinishedPulling="2026-02-17 12:49:34.410095538 +0000 UTC m=+198.004910348" observedRunningTime="2026-02-17 12:49:34.599429787 +0000 UTC m=+198.194244614" watchObservedRunningTime="2026-02-17 12:49:34.600776418 +0000 UTC m=+198.195591252" Feb 17 12:49:35.591904 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:35.591816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp" event={"ID":"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754","Type":"ContainerStarted","Data":"161c6e585dc161dcb215e270f7fdf6169c54c44f7b5180273366c6d9e395c25a"} Feb 17 12:49:35.591904 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:35.591854 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp" event={"ID":"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754","Type":"ContainerStarted","Data":"4b784eccd30763cad0583af357b819208f4cd409b1a76d04f07270ad80ec4117"} Feb 17 12:49:35.591904 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:35.591868 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp" event={"ID":"39f8d1c9-c3cb-4a8a-a78d-a715c0f92754","Type":"ContainerStarted","Data":"7052550a40db995226b4bfec74786957358e8e9f6dfb6899e71f39e17aaecabc"} Feb 17 12:49:35.593370 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:35.593339 2573 generic.go:358] "Generic (PLEG): container finished" podID="aae180f1-f47e-481b-877d-af97cf7e7caa" containerID="c21540aa0acbaaa4859ef3f827217d091c426681b85b3d96ebef973d7d9b4201" exitCode=0 Feb 17 12:49:35.593502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:35.593385 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zdptk" event={"ID":"aae180f1-f47e-481b-877d-af97cf7e7caa","Type":"ContainerDied","Data":"c21540aa0acbaaa4859ef3f827217d091c426681b85b3d96ebef973d7d9b4201"} Feb 17 12:49:35.635617 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:35.635564 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-77b75dc9f9-q67mp" podStartSLOduration=2.371650445 podStartE2EDuration="3.635544804s" podCreationTimestamp="2026-02-17 12:49:32 +0000 UTC" firstStartedPulling="2026-02-17 12:49:33.970511872 +0000 UTC m=+197.565326682" lastFinishedPulling="2026-02-17 12:49:35.234406235 +0000 UTC m=+198.829221041" observedRunningTime="2026-02-17 12:49:35.611024569 +0000 UTC m=+199.205839394" watchObservedRunningTime="2026-02-17 12:49:35.635544804 +0000 UTC m=+199.230359630" Feb 17 12:49:36.009754 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.009715 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l"] Feb 17 12:49:36.013626 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.013605 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.016318 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.016256 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Feb 17 12:49:36.016318 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.016278 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Feb 17 12:49:36.016318 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.016291 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Feb 17 12:49:36.016569 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.016324 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-9kvqdaakqgp5d\"" Feb 17 12:49:36.016569 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.016333 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-wl9nq\"" Feb 17 12:49:36.016569 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.016278 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Feb 17 12:49:36.016569 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.016285 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Feb 17 12:49:36.024432 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.024406 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l"] Feb 17 12:49:36.111438 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.111404 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.111602 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.111525 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.111602 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.111556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-metrics-client-ca\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.111602 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.111594 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfckc\" (UniqueName: 
\"kubernetes.io/projected/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-kube-api-access-lfckc\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.111718 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.111641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.111718 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.111679 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-tls\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.111718 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.111699 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-grpc-tls\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.111849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.111724 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.212827 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.212793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfckc\" (UniqueName: \"kubernetes.io/projected/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-kube-api-access-lfckc\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.212995 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.212837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.212995 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.212948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-tls\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.212995 ip-10-0-131-216 
kubenswrapper[2573]: I0217 12:49:36.212987 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-grpc-tls\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.213198 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.213018 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.213198 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.213078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.213198 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.213160 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.213198 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.213184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-metrics-client-ca\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.213931 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.213908 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-metrics-client-ca\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.215500 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.215456 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.215594 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.215572 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-tls\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " 
pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.215799 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.215780 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.215913 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.215896 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-grpc-tls\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.215957 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.215928 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.215991 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.215965 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.220190 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.220170 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfckc\" (UniqueName: \"kubernetes.io/projected/fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36-kube-api-access-lfckc\") pod \"thanos-querier-7ff677b5f6-qjb5l\" (UID: \"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36\") " pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.325135 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.325047 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:36.443349 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.443316 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l"] Feb 17 12:49:36.446331 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:49:36.446302 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe0c3e07_4e2f_4ec8_a097_8e723c5fcf36.slice/crio-9a5fe6637094d4ab8e3b915705a8e3c5ef93347a77b29bb51f5f161709fe9d2a WatchSource:0}: Error finding container 9a5fe6637094d4ab8e3b915705a8e3c5ef93347a77b29bb51f5f161709fe9d2a: Status 404 returned error can't find the container with id 9a5fe6637094d4ab8e3b915705a8e3c5ef93347a77b29bb51f5f161709fe9d2a Feb 17 12:49:36.597018 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.596925 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" event={"ID":"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36","Type":"ContainerStarted","Data":"9a5fe6637094d4ab8e3b915705a8e3c5ef93347a77b29bb51f5f161709fe9d2a"} Feb 17 12:49:36.598862 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.598826 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zdptk" event={"ID":"aae180f1-f47e-481b-877d-af97cf7e7caa","Type":"ContainerStarted","Data":"93d886cfbdab680be63bd8e24ad3137475542e4caff03dcc612850b75a5db353"} Feb 17 12:49:36.598862 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.598866 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zdptk" event={"ID":"aae180f1-f47e-481b-877d-af97cf7e7caa","Type":"ContainerStarted","Data":"5490d3287ff34eab352bbd4a935f9070b78b61af08811811e9efc92b1f916894"} Feb 17 12:49:36.622412 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:36.622362 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zdptk" podStartSLOduration=3.279009966 podStartE2EDuration="4.622348099s" podCreationTimestamp="2026-02-17 12:49:32 +0000 UTC" firstStartedPulling="2026-02-17 12:49:33.848924279 +0000 UTC m=+197.443739081" lastFinishedPulling="2026-02-17 12:49:35.192262412 +0000 UTC m=+198.787077214" observedRunningTime="2026-02-17 12:49:36.621004539 +0000 UTC m=+200.215819365" watchObservedRunningTime="2026-02-17 12:49:36.622348099 +0000 UTC m=+200.217162925" Feb 17 12:49:38.098353 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.098320 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-75748fc6cd-bwhxw"] Feb 17 12:49:38.102123 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.102091 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.104920 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.104724 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Feb 17 12:49:38.104920 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.104756 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Feb 17 12:49:38.104920 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.104764 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Feb 17 12:49:38.104920 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.104786 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Feb 17 12:49:38.104920 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.104870 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-n5kr6\"" Feb 17 12:49:38.105252 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.105043 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Feb 17 12:49:38.111593 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.110298 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Feb 17 12:49:38.117157 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.117136 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-75748fc6cd-bwhxw"] Feb 17 12:49:38.230312 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.230267 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21ba913d-e137-4041-8efc-9da24c250805-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.230501 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.230332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljb6q\" (UniqueName: \"kubernetes.io/projected/21ba913d-e137-4041-8efc-9da24c250805-kube-api-access-ljb6q\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.230501 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.230372 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21ba913d-e137-4041-8efc-9da24c250805-metrics-client-ca\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.230501 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.230396 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/21ba913d-e137-4041-8efc-9da24c250805-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.230501 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.230420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21ba913d-e137-4041-8efc-9da24c250805-serving-certs-ca-bundle\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.230501 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.230484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/21ba913d-e137-4041-8efc-9da24c250805-federate-client-tls\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.230744 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.230569 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/21ba913d-e137-4041-8efc-9da24c250805-telemeter-client-tls\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.230744 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.230619 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/21ba913d-e137-4041-8efc-9da24c250805-secret-telemeter-client\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.331970 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.331936 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/21ba913d-e137-4041-8efc-9da24c250805-federate-client-tls\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.331970 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.331984 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/21ba913d-e137-4041-8efc-9da24c250805-telemeter-client-tls\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.332228 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.332189 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/21ba913d-e137-4041-8efc-9da24c250805-secret-telemeter-client\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.332346 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.332263 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21ba913d-e137-4041-8efc-9da24c250805-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.332346 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.332307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljb6q\" (UniqueName: \"kubernetes.io/projected/21ba913d-e137-4041-8efc-9da24c250805-kube-api-access-ljb6q\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.332464 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.332438 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21ba913d-e137-4041-8efc-9da24c250805-metrics-client-ca\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.332518 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.332495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21ba913d-e137-4041-8efc-9da24c250805-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.332572 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.332539 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21ba913d-e137-4041-8efc-9da24c250805-serving-certs-ca-bundle\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.333149 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.333099 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21ba913d-e137-4041-8efc-9da24c250805-metrics-client-ca\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.333749 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.333724 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21ba913d-e137-4041-8efc-9da24c250805-serving-certs-ca-bundle\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.334011 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.333984 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21ba913d-e137-4041-8efc-9da24c250805-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.334661 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.334627 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/21ba913d-e137-4041-8efc-9da24c250805-federate-client-tls\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.334957 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.334916 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/21ba913d-e137-4041-8efc-9da24c250805-secret-telemeter-client\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.335040 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.334963 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21ba913d-e137-4041-8efc-9da24c250805-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.335040 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.334970 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/21ba913d-e137-4041-8efc-9da24c250805-telemeter-client-tls\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.341313 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.341278 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljb6q\" (UniqueName: \"kubernetes.io/projected/21ba913d-e137-4041-8efc-9da24c250805-kube-api-access-ljb6q\") pod \"telemeter-client-75748fc6cd-bwhxw\" (UID: \"21ba913d-e137-4041-8efc-9da24c250805\") " pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.419090 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.419061 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" Feb 17 12:49:38.591054 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.591024 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-75748fc6cd-bwhxw"] Feb 17 12:49:38.596890 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:49:38.596859 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21ba913d_e137_4041_8efc_9da24c250805.slice/crio-49e1c7c99ecb6bb550443830658898d3668e6f7c1b22e34f5d9cff174a278b1e WatchSource:0}: Error finding container 49e1c7c99ecb6bb550443830658898d3668e6f7c1b22e34f5d9cff174a278b1e: Status 404 returned error can't find the container with id 49e1c7c99ecb6bb550443830658898d3668e6f7c1b22e34f5d9cff174a278b1e Feb 17 12:49:38.607017 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.606987 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" event={"ID":"21ba913d-e137-4041-8efc-9da24c250805","Type":"ContainerStarted","Data":"49e1c7c99ecb6bb550443830658898d3668e6f7c1b22e34f5d9cff174a278b1e"} Feb 17 12:49:38.608853 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.608824 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" event={"ID":"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36","Type":"ContainerStarted","Data":"f7e9c50f94cb890e9f1886daf6e9884fb12406385aebc06feb3e9a78fc4d499f"} Feb 17 12:49:38.608989 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:38.608855 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" event={"ID":"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36","Type":"ContainerStarted","Data":"388ba431e034d1fa6d41765b784c9957e2d9b5387f9cbaed783c14cf3d41c2f7"} Feb 17 12:49:39.540622 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:39.540566 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-55d85f6897-jnlnq" Feb 17 12:49:39.615894 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:39.615777 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" event={"ID":"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36","Type":"ContainerStarted","Data":"6e5fb294187d6f68daf92743801af32bacaa8fdf188286fb9dffe4a037943baf"} Feb 17 12:49:39.615894 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:39.615826 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" event={"ID":"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36","Type":"ContainerStarted","Data":"f71668c53b0884b38316aafebd0cfccf4cd1645fc8493a43e1d5054e6fefb0e0"} Feb 17 12:49:39.615894 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:39.615849 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" event={"ID":"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36","Type":"ContainerStarted","Data":"cf4739e1b27fd2804407988b1d604f661d5e3923f46461f8c5e9711d4b53fbbe"} Feb 17 12:49:39.615894 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:39.615863 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" event={"ID":"fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36","Type":"ContainerStarted","Data":"b540904e59e82d82c14a1a15a7f48302d4ee26a0507fd4259fb3b0bc010dc869"} Feb 17 12:49:39.616205 ip-10-0-131-216 
kubenswrapper[2573]: I0217 12:49:39.616187 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:39.662153 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:39.662078 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" podStartSLOduration=1.7467761510000002 podStartE2EDuration="4.662060265s" podCreationTimestamp="2026-02-17 12:49:35 +0000 UTC" firstStartedPulling="2026-02-17 12:49:36.448321775 +0000 UTC m=+200.043136577" lastFinishedPulling="2026-02-17 12:49:39.363605886 +0000 UTC m=+202.958420691" observedRunningTime="2026-02-17 12:49:39.660470982 +0000 UTC m=+203.255285806" watchObservedRunningTime="2026-02-17 12:49:39.662060265 +0000 UTC m=+203.256875094" Feb 17 12:49:40.619633 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:40.619552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" event={"ID":"21ba913d-e137-4041-8efc-9da24c250805","Type":"ContainerStarted","Data":"705c7d3d4b926fcc0b8f6b47ed2002724964f61144244b278d2da26ccdab9d73"} Feb 17 12:49:41.624448 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:41.624361 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" event={"ID":"21ba913d-e137-4041-8efc-9da24c250805","Type":"ContainerStarted","Data":"faffe558a28c2fa11a9d4d7b2bddb554206a14857dc1d2fb0fa35bd2fd6a27d1"} Feb 17 12:49:41.624448 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:41.624398 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" event={"ID":"21ba913d-e137-4041-8efc-9da24c250805","Type":"ContainerStarted","Data":"05b02061a40fd7bd5c5b66c6a4c7ef597775b9e7dbdace8bb4900dcef1a420c6"} Feb 17 12:49:41.657731 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:41.657676 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-75748fc6cd-bwhxw" podStartSLOduration=1.071388005 podStartE2EDuration="3.65765859s" podCreationTimestamp="2026-02-17 12:49:38 +0000 UTC" firstStartedPulling="2026-02-17 12:49:38.599093634 +0000 UTC m=+202.193908436" lastFinishedPulling="2026-02-17 12:49:41.185364217 +0000 UTC m=+204.780179021" observedRunningTime="2026-02-17 12:49:41.656637434 +0000 UTC m=+205.251452259" watchObservedRunningTime="2026-02-17 12:49:41.65765859 +0000 UTC m=+205.252473439" Feb 17 12:49:45.625604 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:45.625575 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7ff677b5f6-qjb5l" Feb 17 12:49:47.929291 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:47.929257 2573 scope.go:117] "RemoveContainer" containerID="e958cb0d42f464b45838c76cf07f395719f1bc803a4397047de55f9a977be5f3" Feb 17 12:49:48.645826 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:48.645799 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/2.log" Feb 17 12:49:48.646007 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:48.645868 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" 
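The two pod_startup_latency_tracker records above appear to be internally consistent: watchObservedRunningTime minus podCreationTimestamp reproduces podStartE2EDuration, and subtracting the image-pull window (lastFinishedPulling minus firstStartedPulling) reproduces podStartSLOduration. A quick hedged check with the thanos-querier values, truncating the logged nanoseconds to microseconds since Python's %f parses at most six fractional digits:

    from datetime import datetime

    def ts(s):
        # The journal prints e.g. "2026-02-17 12:49:39.662060265 +0000 UTC";
        # trim the " UTC" suffix and truncate nanoseconds for strptime.
        s = s.replace(" UTC", "")
        date, time, tz = s.split()
        if "." in time:
            whole, frac = time.split(".")
            time = whole + "." + frac[:6]
        else:
            time += ".000000"
        return datetime.strptime(f"{date} {time} {tz}", "%Y-%m-%d %H:%M:%S.%f %z")

    # Values copied from the thanos-querier-7ff677b5f6-qjb5l record above.
    created = ts("2026-02-17 12:49:35 +0000 UTC")            # podCreationTimestamp
    running = ts("2026-02-17 12:49:39.662060265 +0000 UTC")  # watchObservedRunningTime
    pull_a  = ts("2026-02-17 12:49:36.448321775 +0000 UTC")  # firstStartedPulling
    pull_b  = ts("2026-02-17 12:49:39.363605886 +0000 UTC")  # lastFinishedPulling

    e2e = (running - created).total_seconds()
    slo = e2e - (pull_b - pull_a).total_seconds()
    print(f"E2E ~ {e2e:.6f}s (logged: 4.662060265s)")
    print(f"SLO ~ {slo:.6f}s (logged: 1.7467761510000002)")

Both printed values match the logged durations to the retained precision; the telemeter-client record at 12:49:41.657731 satisfies the same arithmetic.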
event={"ID":"276ac3fc-41f7-4f46-8cd1-e26a91986d96","Type":"ContainerStarted","Data":"350e53686ab82e01cae714891a94f148cdaf1372f2de63c2abe09ce7c59d6159"} Feb 17 12:49:48.646171 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:48.646150 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:49:48.650802 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:48.650779 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" Feb 17 12:49:48.665036 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:48.664968 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-5744d8689c-4b6mv" podStartSLOduration=54.063234076 podStartE2EDuration="57.664950725s" podCreationTimestamp="2026-02-17 12:48:51 +0000 UTC" firstStartedPulling="2026-02-17 12:48:51.774274488 +0000 UTC m=+155.369089291" lastFinishedPulling="2026-02-17 12:48:55.375991134 +0000 UTC m=+158.970805940" observedRunningTime="2026-02-17 12:49:48.66359692 +0000 UTC m=+212.258411746" watchObservedRunningTime="2026-02-17 12:49:48.664950725 +0000 UTC m=+212.259765582" Feb 17 12:49:48.811052 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:48.811018 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-58b949d66d-pn5bm"] Feb 17 12:49:48.814483 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:48.814463 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-58b949d66d-pn5bm" Feb 17 12:49:48.817348 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:48.817324 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-2b4k9\"" Feb 17 12:49:48.817453 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:48.817329 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Feb 17 12:49:48.817453 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:48.817329 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Feb 17 12:49:48.824025 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:48.823992 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-58b949d66d-pn5bm"] Feb 17 12:49:48.925970 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:48.925877 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb6nj\" (UniqueName: \"kubernetes.io/projected/056c76cf-5dfe-4900-898b-551996c808b0-kube-api-access-vb6nj\") pod \"downloads-58b949d66d-pn5bm\" (UID: \"056c76cf-5dfe-4900-898b-551996c808b0\") " pod="openshift-console/downloads-58b949d66d-pn5bm" Feb 17 12:49:49.026711 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:49.026672 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb6nj\" (UniqueName: \"kubernetes.io/projected/056c76cf-5dfe-4900-898b-551996c808b0-kube-api-access-vb6nj\") pod \"downloads-58b949d66d-pn5bm\" (UID: \"056c76cf-5dfe-4900-898b-551996c808b0\") " pod="openshift-console/downloads-58b949d66d-pn5bm" Feb 17 12:49:49.034867 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:49.034830 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb6nj\" (UniqueName: 
\"kubernetes.io/projected/056c76cf-5dfe-4900-898b-551996c808b0-kube-api-access-vb6nj\") pod \"downloads-58b949d66d-pn5bm\" (UID: \"056c76cf-5dfe-4900-898b-551996c808b0\") " pod="openshift-console/downloads-58b949d66d-pn5bm" Feb 17 12:49:49.124350 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:49.124309 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-58b949d66d-pn5bm" Feb 17 12:49:49.240227 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:49.240194 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-58b949d66d-pn5bm"] Feb 17 12:49:49.242765 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:49:49.242730 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod056c76cf_5dfe_4900_898b_551996c808b0.slice/crio-1bcd982b1a4fb7d52ea26077e5a6b9573cd614d43bbdf4cbf339fc84104411f7 WatchSource:0}: Error finding container 1bcd982b1a4fb7d52ea26077e5a6b9573cd614d43bbdf4cbf339fc84104411f7: Status 404 returned error can't find the container with id 1bcd982b1a4fb7d52ea26077e5a6b9573cd614d43bbdf4cbf339fc84104411f7 Feb 17 12:49:49.650068 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:49.650037 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-58b949d66d-pn5bm" event={"ID":"056c76cf-5dfe-4900-898b-551996c808b0","Type":"ContainerStarted","Data":"1bcd982b1a4fb7d52ea26077e5a6b9573cd614d43bbdf4cbf339fc84104411f7"} Feb 17 12:49:57.859724 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.859688 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bf55cdf66-fmlfb"] Feb 17 12:49:57.864158 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.864135 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:57.866872 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.866840 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-cs2ql\"" Feb 17 12:49:57.867091 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.866840 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Feb 17 12:49:57.868073 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.868048 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Feb 17 12:49:57.868194 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.868092 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Feb 17 12:49:57.868194 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.868091 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Feb 17 12:49:57.868194 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.868094 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Feb 17 12:49:57.872407 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.872320 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bf55cdf66-fmlfb"] Feb 17 12:49:57.910648 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.910622 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-oauth-serving-cert\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:57.910808 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.910689 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e63218-57aa-461f-be09-a74c47b5b491-console-serving-cert\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:57.910808 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.910715 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-console-config\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:57.910931 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.910848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-service-ca\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:57.910931 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.910894 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t2q5\" (UniqueName: \"kubernetes.io/projected/e4e63218-57aa-461f-be09-a74c47b5b491-kube-api-access-4t2q5\") pod 
\"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:57.911022 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:57.910976 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4e63218-57aa-461f-be09-a74c47b5b491-console-oauth-config\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.012134 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.012009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-service-ca\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.012314 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.012219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4t2q5\" (UniqueName: \"kubernetes.io/projected/e4e63218-57aa-461f-be09-a74c47b5b491-kube-api-access-4t2q5\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.012314 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.012309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4e63218-57aa-461f-be09-a74c47b5b491-console-oauth-config\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.012436 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.012369 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-oauth-serving-cert\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.012494 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.012447 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e63218-57aa-461f-be09-a74c47b5b491-console-serving-cert\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.012544 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.012485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-console-config\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.012847 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.012820 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-service-ca\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.013072 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.013047 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-oauth-serving-cert\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.013228 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.013207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-console-config\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.015332 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.015313 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e63218-57aa-461f-be09-a74c47b5b491-console-serving-cert\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.015544 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.015525 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4e63218-57aa-461f-be09-a74c47b5b491-console-oauth-config\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.020559 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.020536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t2q5\" (UniqueName: \"kubernetes.io/projected/e4e63218-57aa-461f-be09-a74c47b5b491-kube-api-access-4t2q5\") pod \"console-6bf55cdf66-fmlfb\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.177735 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.177693 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:49:58.316381 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.316357 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bf55cdf66-fmlfb"] Feb 17 12:49:58.318816 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:49:58.318735 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4e63218_57aa_461f_be09_a74c47b5b491.slice/crio-b55ab9758245d9847ab5616dd7064060eea00852d7c8f171e61ec3dcbd7c2d0c WatchSource:0}: Error finding container b55ab9758245d9847ab5616dd7064060eea00852d7c8f171e61ec3dcbd7c2d0c: Status 404 returned error can't find the container with id b55ab9758245d9847ab5616dd7064060eea00852d7c8f171e61ec3dcbd7c2d0c Feb 17 12:49:58.679668 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:49:58.679589 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bf55cdf66-fmlfb" event={"ID":"e4e63218-57aa-461f-be09-a74c47b5b491","Type":"ContainerStarted","Data":"b55ab9758245d9847ab5616dd7064060eea00852d7c8f171e61ec3dcbd7c2d0c"} Feb 17 12:50:01.691870 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:01.691829 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bf55cdf66-fmlfb" event={"ID":"e4e63218-57aa-461f-be09-a74c47b5b491","Type":"ContainerStarted","Data":"5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507"} Feb 17 12:50:01.709661 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:01.709617 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bf55cdf66-fmlfb" podStartSLOduration=1.787940182 podStartE2EDuration="4.709602321s" podCreationTimestamp="2026-02-17 12:49:57 +0000 UTC" firstStartedPulling="2026-02-17 12:49:58.321007353 +0000 UTC m=+221.915822171" lastFinishedPulling="2026-02-17 12:50:01.242669507 +0000 UTC m=+224.837484310" observedRunningTime="2026-02-17 12:50:01.707708636 +0000 UTC m=+225.302523461" watchObservedRunningTime="2026-02-17 12:50:01.709602321 +0000 UTC m=+225.304417146" Feb 17 12:50:02.734626 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.734590 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-54ff64c78c-cq6lm"] Feb 17 12:50:02.738534 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.738497 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.748854 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.748825 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Feb 17 12:50:02.750062 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.750044 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54ff64c78c-cq6lm"] Feb 17 12:50:02.861988 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.861946 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-oauth-serving-cert\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.862186 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.862016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-oauth-config\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.862186 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.862062 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-service-ca\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.862186 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.862093 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-serving-cert\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.862335 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.862193 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-config\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.862335 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.862237 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-trusted-ca-bundle\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.862427 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.862338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l7xf\" (UniqueName: \"kubernetes.io/projected/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-kube-api-access-6l7xf\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.963516 ip-10-0-131-216 
kubenswrapper[2573]: I0217 12:50:02.963479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l7xf\" (UniqueName: \"kubernetes.io/projected/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-kube-api-access-6l7xf\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.963691 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.963531 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-oauth-serving-cert\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.963691 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.963573 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-oauth-config\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.963691 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.963619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-service-ca\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.963691 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.963651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-serving-cert\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.963691 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.963684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-config\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.963933 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.963711 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-trusted-ca-bundle\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.964436 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.964404 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-oauth-serving-cert\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.964650 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.964623 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-trusted-ca-bundle\") pod 
\"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.964731 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.964681 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-config\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.964846 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.964825 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-service-ca\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.966861 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.966835 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-serving-cert\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.966962 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.966895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-oauth-config\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:02.971809 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:02.971790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l7xf\" (UniqueName: \"kubernetes.io/projected/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-kube-api-access-6l7xf\") pod \"console-54ff64c78c-cq6lm\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:03.050662 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:03.050568 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:08.178506 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:08.178472 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:50:08.178950 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:08.178573 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:50:08.184085 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:08.184062 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:50:08.719036 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:08.719005 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:50:10.927466 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:10.927431 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54ff64c78c-cq6lm"] Feb 17 12:50:10.930065 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:50:10.930036 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d12be6_04d7_4fff_87ef_c2c7c4c18f58.slice/crio-55cca42129beaf651d3b9df32136a5b6d5452ceab39adad36d756aa1f89ed7c6 WatchSource:0}: Error finding container 55cca42129beaf651d3b9df32136a5b6d5452ceab39adad36d756aa1f89ed7c6: Status 404 returned error can't find the container with id 55cca42129beaf651d3b9df32136a5b6d5452ceab39adad36d756aa1f89ed7c6 Feb 17 12:50:11.725506 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:11.725469 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-58b949d66d-pn5bm" event={"ID":"056c76cf-5dfe-4900-898b-551996c808b0","Type":"ContainerStarted","Data":"fc3410b91431b3d3fb19dbc67ecfc0afc10985ed5552ebb2d8cc441672fea42e"} Feb 17 12:50:11.725705 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:11.725685 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-58b949d66d-pn5bm" Feb 17 12:50:11.726979 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:11.726955 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54ff64c78c-cq6lm" event={"ID":"39d12be6-04d7-4fff-87ef-c2c7c4c18f58","Type":"ContainerStarted","Data":"b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9"} Feb 17 12:50:11.727084 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:11.726982 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54ff64c78c-cq6lm" event={"ID":"39d12be6-04d7-4fff-87ef-c2c7c4c18f58","Type":"ContainerStarted","Data":"55cca42129beaf651d3b9df32136a5b6d5452ceab39adad36d756aa1f89ed7c6"} Feb 17 12:50:11.727391 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:11.727364 2573 patch_prober.go:28] interesting pod/downloads-58b949d66d-pn5bm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.22:8080/\": dial tcp 10.133.0.22:8080: connect: connection refused" start-of-body= Feb 17 12:50:11.727482 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:11.727436 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-58b949d66d-pn5bm" podUID="056c76cf-5dfe-4900-898b-551996c808b0" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.22:8080/\": 
dial tcp 10.133.0.22:8080: connect: connection refused" Feb 17 12:50:11.740801 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:11.740555 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-58b949d66d-pn5bm" podStartSLOduration=1.77148702 podStartE2EDuration="23.740529123s" podCreationTimestamp="2026-02-17 12:49:48 +0000 UTC" firstStartedPulling="2026-02-17 12:49:49.244540742 +0000 UTC m=+212.839355547" lastFinishedPulling="2026-02-17 12:50:11.213582845 +0000 UTC m=+234.808397650" observedRunningTime="2026-02-17 12:50:11.740426947 +0000 UTC m=+235.335241772" watchObservedRunningTime="2026-02-17 12:50:11.740529123 +0000 UTC m=+235.335343949" Feb 17 12:50:11.768281 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:11.768219 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54ff64c78c-cq6lm" podStartSLOduration=9.768203295 podStartE2EDuration="9.768203295s" podCreationTimestamp="2026-02-17 12:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 12:50:11.767409701 +0000 UTC m=+235.362224527" watchObservedRunningTime="2026-02-17 12:50:11.768203295 +0000 UTC m=+235.363018124" Feb 17 12:50:12.730070 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:12.730034 2573 patch_prober.go:28] interesting pod/downloads-58b949d66d-pn5bm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.22:8080/\": dial tcp 10.133.0.22:8080: connect: connection refused" start-of-body= Feb 17 12:50:12.730463 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:12.730091 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-58b949d66d-pn5bm" podUID="056c76cf-5dfe-4900-898b-551996c808b0" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.22:8080/\": dial tcp 10.133.0.22:8080: connect: connection refused" Feb 17 12:50:13.050807 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:13.050717 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:13.050807 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:13.050771 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:13.055472 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:13.055445 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:13.737033 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:13.737003 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:50:13.780246 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:13.780211 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bf55cdf66-fmlfb"] Feb 17 12:50:16.743225 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:16.743190 2573 generic.go:358] "Generic (PLEG): container finished" podID="1a8cc667-aa21-4c52-810c-330a53bdcfd3" containerID="00fb147f5075f57b88d511b62780747405e80fecd32542403d2f34debc46e582" exitCode=0 Feb 17 12:50:16.743654 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:16.743266 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" 
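Two details in the records above are worth noting. The downloads pod's E2E startup (23.74s) dwarfs its SLO duration (1.77s) because almost all of it was image pull (12:49:49.24 to 12:50:11.21), and its readiness probe kept failing with connection refused until the freshly started server began listening. The console-54ff64c78c record, by contrast, logs both pull timestamps as Go's zero time (0001-01-01 00:00:00 +0000 UTC), i.e. no pull was recorded for it, and its SLO and E2E durations coincide at 9.768203295s. A sketch that flags such no-pull records (kubelet.log is an illustrative file name for this journal):

    import re

    ZERO = "0001-01-01 00:00:00 +0000 UTC"
    REC = re.compile(r'pod="([^"]+)".*podStartSLOduration=([\d.]+)'
                     r'.*podStartE2EDuration="([^"]+)"')

    with open("kubelet.log") as f:
        for line in f:
            if "Observed pod startup duration" not in line:
                continue
            m = REC.search(line)
            if m and ZERO in line:
                # No image pull recorded: SLO and E2E durations coincide.
                print(m.group(1), "SLO =", m.group(2), "E2E =", m.group(3))

Run against this section it would print only the console-54ff64c78c-cq6lm record.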
event={"ID":"1a8cc667-aa21-4c52-810c-330a53bdcfd3","Type":"ContainerDied","Data":"00fb147f5075f57b88d511b62780747405e80fecd32542403d2f34debc46e582"} Feb 17 12:50:16.743654 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:16.743647 2573 scope.go:117] "RemoveContainer" containerID="00fb147f5075f57b88d511b62780747405e80fecd32542403d2f34debc46e582" Feb 17 12:50:17.748377 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:17.748338 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5d56856ff5-ctf9v" event={"ID":"1a8cc667-aa21-4c52-810c-330a53bdcfd3","Type":"ContainerStarted","Data":"785e45a2e02c7a4c09de194e96dddaff6d21c7e173234b574d27040738cf4b42"} Feb 17 12:50:17.750004 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:17.749971 2573 generic.go:358] "Generic (PLEG): container finished" podID="d1ead7d2-81f9-4afa-8d87-188a741e9848" containerID="8651a9eb76b7ec94f4afe9350de475b815883e238811c1f75668994d0240ba25" exitCode=0 Feb 17 12:50:17.750148 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:17.750047 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" event={"ID":"d1ead7d2-81f9-4afa-8d87-188a741e9848","Type":"ContainerDied","Data":"8651a9eb76b7ec94f4afe9350de475b815883e238811c1f75668994d0240ba25"} Feb 17 12:50:17.750417 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:17.750400 2573 scope.go:117] "RemoveContainer" containerID="8651a9eb76b7ec94f4afe9350de475b815883e238811c1f75668994d0240ba25" Feb 17 12:50:18.754526 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:18.754494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs" event={"ID":"d1ead7d2-81f9-4afa-8d87-188a741e9848","Type":"ContainerStarted","Data":"b65e0b2b9c5c2eb42f9bc2bbcae9f953285fc0f17847a08475b06befca476931"} Feb 17 12:50:22.742653 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:22.742618 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-58b949d66d-pn5bm" Feb 17 12:50:28.809593 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:28.809516 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:50:28.811892 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:28.811867 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad710990-167a-49aa-bad8-faa970a4c3bb-metrics-certs\") pod \"network-metrics-daemon-cnhns\" (UID: \"ad710990-167a-49aa-bad8-faa970a4c3bb\") " pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:50:28.933469 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:28.933438 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-b7dtp\"" Feb 17 12:50:28.940690 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:28.940668 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnhns" Feb 17 12:50:29.058495 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:29.058476 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cnhns"] Feb 17 12:50:29.060669 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:50:29.060604 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad710990_167a_49aa_bad8_faa970a4c3bb.slice/crio-10b535a05b09e021dd784d50b1b598bc0f2bf627e3123971c6138da4ee85901d WatchSource:0}: Error finding container 10b535a05b09e021dd784d50b1b598bc0f2bf627e3123971c6138da4ee85901d: Status 404 returned error can't find the container with id 10b535a05b09e021dd784d50b1b598bc0f2bf627e3123971c6138da4ee85901d Feb 17 12:50:29.787440 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:29.787403 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cnhns" event={"ID":"ad710990-167a-49aa-bad8-faa970a4c3bb","Type":"ContainerStarted","Data":"10b535a05b09e021dd784d50b1b598bc0f2bf627e3123971c6138da4ee85901d"} Feb 17 12:50:30.791819 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:30.791784 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cnhns" event={"ID":"ad710990-167a-49aa-bad8-faa970a4c3bb","Type":"ContainerStarted","Data":"25eb8609da7195d033e8414104dce71a1330cca21050afced39f47d62a405f1f"} Feb 17 12:50:30.791819 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:30.791825 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cnhns" event={"ID":"ad710990-167a-49aa-bad8-faa970a4c3bb","Type":"ContainerStarted","Data":"abeca63294f83df300edd09792d4d8417e04dc3a5183a52262ff6fbe1b7cb85f"} Feb 17 12:50:30.878249 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:30.878203 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cnhns" podStartSLOduration=252.97159954400001 podStartE2EDuration="4m13.878187461s" podCreationTimestamp="2026-02-17 12:46:17 +0000 UTC" firstStartedPulling="2026-02-17 12:50:29.062573152 +0000 UTC m=+252.657387959" lastFinishedPulling="2026-02-17 12:50:29.969161073 +0000 UTC m=+253.563975876" observedRunningTime="2026-02-17 12:50:30.877718511 +0000 UTC m=+254.472533333" watchObservedRunningTime="2026-02-17 12:50:30.878187461 +0000 UTC m=+254.473002285" Feb 17 12:50:31.798991 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:31.798957 2573 generic.go:358] "Generic (PLEG): container finished" podID="c2d6500e-0397-48cd-bf45-464b40e47782" containerID="381643b9a63e2e1f088818effaea33dc673be79702dedab45168c62c80d4d7b0" exitCode=0 Feb 17 12:50:31.799397 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:31.799026 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" event={"ID":"c2d6500e-0397-48cd-bf45-464b40e47782","Type":"ContainerDied","Data":"381643b9a63e2e1f088818effaea33dc673be79702dedab45168c62c80d4d7b0"} Feb 17 12:50:31.799475 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:31.799461 2573 scope.go:117] "RemoveContainer" containerID="381643b9a63e2e1f088818effaea33dc673be79702dedab45168c62c80d4d7b0" Feb 17 12:50:32.804604 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:32.804571 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-ffd9f846b-scl5h" 
event={"ID":"c2d6500e-0397-48cd-bf45-464b40e47782","Type":"ContainerStarted","Data":"89fce1de9152a51719695eddd8a5d3230fdf041e20de2799fe523bd41faf582e"} Feb 17 12:50:38.803206 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:38.803166 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6bf55cdf66-fmlfb" podUID="e4e63218-57aa-461f-be09-a74c47b5b491" containerName="console" containerID="cri-o://5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507" gracePeriod=15 Feb 17 12:50:39.077403 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.077380 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bf55cdf66-fmlfb_e4e63218-57aa-461f-be09-a74c47b5b491/console/0.log" Feb 17 12:50:39.077528 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.077454 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:50:39.195504 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.195470 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-console-config\") pod \"e4e63218-57aa-461f-be09-a74c47b5b491\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " Feb 17 12:50:39.195688 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.195514 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-service-ca\") pod \"e4e63218-57aa-461f-be09-a74c47b5b491\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " Feb 17 12:50:39.195688 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.195591 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-oauth-serving-cert\") pod \"e4e63218-57aa-461f-be09-a74c47b5b491\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " Feb 17 12:50:39.195688 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.195639 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t2q5\" (UniqueName: \"kubernetes.io/projected/e4e63218-57aa-461f-be09-a74c47b5b491-kube-api-access-4t2q5\") pod \"e4e63218-57aa-461f-be09-a74c47b5b491\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " Feb 17 12:50:39.195688 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.195661 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4e63218-57aa-461f-be09-a74c47b5b491-console-oauth-config\") pod \"e4e63218-57aa-461f-be09-a74c47b5b491\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " Feb 17 12:50:39.195878 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.195709 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e63218-57aa-461f-be09-a74c47b5b491-console-serving-cert\") pod \"e4e63218-57aa-461f-be09-a74c47b5b491\" (UID: \"e4e63218-57aa-461f-be09-a74c47b5b491\") " Feb 17 12:50:39.196007 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.195973 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-service-ca" (OuterVolumeSpecName: "service-ca") pod 
"e4e63218-57aa-461f-be09-a74c47b5b491" (UID: "e4e63218-57aa-461f-be09-a74c47b5b491"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 12:50:39.196082 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.196010 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e4e63218-57aa-461f-be09-a74c47b5b491" (UID: "e4e63218-57aa-461f-be09-a74c47b5b491"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 12:50:39.196082 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.195985 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-console-config" (OuterVolumeSpecName: "console-config") pod "e4e63218-57aa-461f-be09-a74c47b5b491" (UID: "e4e63218-57aa-461f-be09-a74c47b5b491"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 12:50:39.197931 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.197905 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e63218-57aa-461f-be09-a74c47b5b491-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e4e63218-57aa-461f-be09-a74c47b5b491" (UID: "e4e63218-57aa-461f-be09-a74c47b5b491"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:50:39.198016 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.197922 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e63218-57aa-461f-be09-a74c47b5b491-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e4e63218-57aa-461f-be09-a74c47b5b491" (UID: "e4e63218-57aa-461f-be09-a74c47b5b491"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:50:39.198016 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.197936 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e63218-57aa-461f-be09-a74c47b5b491-kube-api-access-4t2q5" (OuterVolumeSpecName: "kube-api-access-4t2q5") pod "e4e63218-57aa-461f-be09-a74c47b5b491" (UID: "e4e63218-57aa-461f-be09-a74c47b5b491"). InnerVolumeSpecName "kube-api-access-4t2q5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 12:50:39.297191 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.297154 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-oauth-serving-cert\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:50:39.297191 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.297185 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4t2q5\" (UniqueName: \"kubernetes.io/projected/e4e63218-57aa-461f-be09-a74c47b5b491-kube-api-access-4t2q5\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:50:39.297191 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.297199 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4e63218-57aa-461f-be09-a74c47b5b491-console-oauth-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:50:39.297411 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.297212 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e63218-57aa-461f-be09-a74c47b5b491-console-serving-cert\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:50:39.297411 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.297228 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-console-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:50:39.297411 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.297240 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4e63218-57aa-461f-be09-a74c47b5b491-service-ca\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:50:39.830000 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.829971 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bf55cdf66-fmlfb_e4e63218-57aa-461f-be09-a74c47b5b491/console/0.log" Feb 17 12:50:39.830422 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.830009 2573 generic.go:358] "Generic (PLEG): container finished" podID="e4e63218-57aa-461f-be09-a74c47b5b491" containerID="5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507" exitCode=2 Feb 17 12:50:39.830422 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.830074 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bf55cdf66-fmlfb" Feb 17 12:50:39.830422 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.830095 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bf55cdf66-fmlfb" event={"ID":"e4e63218-57aa-461f-be09-a74c47b5b491","Type":"ContainerDied","Data":"5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507"} Feb 17 12:50:39.830422 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.830150 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bf55cdf66-fmlfb" event={"ID":"e4e63218-57aa-461f-be09-a74c47b5b491","Type":"ContainerDied","Data":"b55ab9758245d9847ab5616dd7064060eea00852d7c8f171e61ec3dcbd7c2d0c"} Feb 17 12:50:39.830422 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.830166 2573 scope.go:117] "RemoveContainer" containerID="5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507" Feb 17 12:50:39.838854 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.838837 2573 scope.go:117] "RemoveContainer" containerID="5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507" Feb 17 12:50:39.839128 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:50:39.839088 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507\": container with ID starting with 5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507 not found: ID does not exist" containerID="5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507" Feb 17 12:50:39.839192 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.839135 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507"} err="failed to get container status \"5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507\": rpc error: code = NotFound desc = could not find container \"5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507\": container with ID starting with 5569b28d4b7300b9a9fba5f63f20f427932dcb5ae60bf620b9ba081bbedff507 not found: ID does not exist" Feb 17 12:50:39.850738 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.850714 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bf55cdf66-fmlfb"] Feb 17 12:50:39.853890 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.853871 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bf55cdf66-fmlfb"] Feb 17 12:50:40.933180 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:40.933149 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e63218-57aa-461f-be09-a74c47b5b491" path="/var/lib/kubelet/pods/e4e63218-57aa-461f-be09-a74c47b5b491/volumes" Feb 17 12:50:53.081128 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.081072 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74b4d54459-cnrtv"] Feb 17 12:50:53.081832 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.081546 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4e63218-57aa-461f-be09-a74c47b5b491" containerName="console" Feb 17 12:50:53.081832 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.081567 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e63218-57aa-461f-be09-a74c47b5b491" containerName="console" Feb 17 12:50:53.081832 ip-10-0-131-216 
Feb 17 12:50:39.850738 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.850714 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bf55cdf66-fmlfb"]
Feb 17 12:50:39.853890 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:39.853871 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bf55cdf66-fmlfb"]
Feb 17 12:50:40.933180 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:40.933149 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e63218-57aa-461f-be09-a74c47b5b491" path="/var/lib/kubelet/pods/e4e63218-57aa-461f-be09-a74c47b5b491/volumes"
Feb 17 12:50:53.081128 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.081072 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74b4d54459-cnrtv"]
Feb 17 12:50:53.081832 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.081546 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4e63218-57aa-461f-be09-a74c47b5b491" containerName="console"
Feb 17 12:50:53.081832 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.081567 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e63218-57aa-461f-be09-a74c47b5b491" containerName="console"
Feb 17 12:50:53.081832 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.081669 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4e63218-57aa-461f-be09-a74c47b5b491" containerName="console"
Feb 17 12:50:53.087628 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.087606 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b4d54459-cnrtv"
Feb 17 12:50:53.096269 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.096241 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74b4d54459-cnrtv"]
Feb 17 12:50:53.214670 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.214629 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-service-ca\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv"
Feb 17 12:50:53.214855 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.214679 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rspwq\" (UniqueName: \"kubernetes.io/projected/eadf4e60-1e2b-422f-8b32-1c63868d413f-kube-api-access-rspwq\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv"
Feb 17 12:50:53.214855 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.214761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-oauth-config\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv"
Feb 17 12:50:53.214855 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.214810 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-trusted-ca-bundle\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv"
Feb 17 12:50:53.214855 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.214829 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-oauth-serving-cert\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv"
Feb 17 12:50:53.215065 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.214941 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-serving-cert\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv"
Feb 17 12:50:53.215065 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.214985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-config\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") "
pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.315862 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.315828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-serving-cert\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.315862 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.315865 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-config\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.316096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.315904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-service-ca\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.316096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.315926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rspwq\" (UniqueName: \"kubernetes.io/projected/eadf4e60-1e2b-422f-8b32-1c63868d413f-kube-api-access-rspwq\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.316096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.315960 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-oauth-config\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.316096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.315987 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-trusted-ca-bundle\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.316096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.316007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-oauth-serving-cert\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.316793 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.316765 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-config\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.316931 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.316793 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-oauth-serving-cert\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.316931 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.316860 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-trusted-ca-bundle\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.317184 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.317163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-service-ca\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.318403 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.318379 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-oauth-config\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.318527 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.318507 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-serving-cert\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.323567 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.323541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rspwq\" (UniqueName: \"kubernetes.io/projected/eadf4e60-1e2b-422f-8b32-1c63868d413f-kube-api-access-rspwq\") pod \"console-74b4d54459-cnrtv\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.400291 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.400258 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:50:53.549289 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.549266 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74b4d54459-cnrtv"] Feb 17 12:50:53.551380 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:50:53.551342 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeadf4e60_1e2b_422f_8b32_1c63868d413f.slice/crio-a210d7987cf0d07be9c1e556c6a70c11332e9ec7fe7af1d7145369bc3d46b31b WatchSource:0}: Error finding container a210d7987cf0d07be9c1e556c6a70c11332e9ec7fe7af1d7145369bc3d46b31b: Status 404 returned error can't find the container with id a210d7987cf0d07be9c1e556c6a70c11332e9ec7fe7af1d7145369bc3d46b31b Feb 17 12:50:53.879948 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.879865 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b4d54459-cnrtv" event={"ID":"eadf4e60-1e2b-422f-8b32-1c63868d413f","Type":"ContainerStarted","Data":"f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e"} Feb 17 12:50:53.879948 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.879901 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b4d54459-cnrtv" event={"ID":"eadf4e60-1e2b-422f-8b32-1c63868d413f","Type":"ContainerStarted","Data":"a210d7987cf0d07be9c1e556c6a70c11332e9ec7fe7af1d7145369bc3d46b31b"} Feb 17 12:50:53.927791 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:50:53.927746 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74b4d54459-cnrtv" podStartSLOduration=0.927731647 podStartE2EDuration="927.731647ms" podCreationTimestamp="2026-02-17 12:50:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 12:50:53.926974041 +0000 UTC m=+277.521788867" watchObservedRunningTime="2026-02-17 12:50:53.927731647 +0000 UTC m=+277.522546474" Feb 17 12:51:03.401009 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:03.400969 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:51:03.401439 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:03.401025 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:51:03.405826 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:03.405804 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:51:03.912945 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:03.912920 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:51:03.955266 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:03.955237 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54ff64c78c-cq6lm"] Feb 17 12:51:16.872735 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:16.872707 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/2.log" Feb 17 12:51:16.873193 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:16.872780 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/2.log" Feb 17 12:51:16.880417 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:16.880390 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log" Feb 17 12:51:16.880658 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:16.880632 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log" Feb 17 12:51:16.884088 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:16.884068 2573 kubelet.go:1628] "Image garbage collection succeeded" Feb 17 12:51:28.976212 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:28.976166 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-54ff64c78c-cq6lm" podUID="39d12be6-04d7-4fff-87ef-c2c7c4c18f58" containerName="console" containerID="cri-o://b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9" gracePeriod=15 Feb 17 12:51:29.210351 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.210328 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54ff64c78c-cq6lm_39d12be6-04d7-4fff-87ef-c2c7c4c18f58/console/0.log" Feb 17 12:51:29.210489 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.210390 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:51:29.326047 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.325951 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-oauth-config\") pod \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " Feb 17 12:51:29.326047 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326008 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-config\") pod \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " Feb 17 12:51:29.326047 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326037 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-service-ca\") pod \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " Feb 17 12:51:29.326361 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326071 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-oauth-serving-cert\") pod \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " Feb 17 12:51:29.326361 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326136 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l7xf\" (UniqueName: \"kubernetes.io/projected/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-kube-api-access-6l7xf\") pod \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " Feb 17 12:51:29.326361 ip-10-0-131-216 
kubenswrapper[2573]: I0217 12:51:29.326166 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-trusted-ca-bundle\") pod \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " Feb 17 12:51:29.326361 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326184 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-serving-cert\") pod \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\" (UID: \"39d12be6-04d7-4fff-87ef-c2c7c4c18f58\") " Feb 17 12:51:29.326565 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326516 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-config" (OuterVolumeSpecName: "console-config") pod "39d12be6-04d7-4fff-87ef-c2c7c4c18f58" (UID: "39d12be6-04d7-4fff-87ef-c2c7c4c18f58"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 12:51:29.326671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326631 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "39d12be6-04d7-4fff-87ef-c2c7c4c18f58" (UID: "39d12be6-04d7-4fff-87ef-c2c7c4c18f58"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 12:51:29.326792 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326699 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-service-ca" (OuterVolumeSpecName: "service-ca") pod "39d12be6-04d7-4fff-87ef-c2c7c4c18f58" (UID: "39d12be6-04d7-4fff-87ef-c2c7c4c18f58"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 12:51:29.326792 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326714 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "39d12be6-04d7-4fff-87ef-c2c7c4c18f58" (UID: "39d12be6-04d7-4fff-87ef-c2c7c4c18f58"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 12:51:29.326908 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326796 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-trusted-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:51:29.326908 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326810 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:51:29.326908 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326822 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-service-ca\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:51:29.326908 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.326834 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-oauth-serving-cert\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:51:29.328438 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.328416 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "39d12be6-04d7-4fff-87ef-c2c7c4c18f58" (UID: "39d12be6-04d7-4fff-87ef-c2c7c4c18f58"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:51:29.328748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.328722 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "39d12be6-04d7-4fff-87ef-c2c7c4c18f58" (UID: "39d12be6-04d7-4fff-87ef-c2c7c4c18f58"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:51:29.328832 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.328742 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-kube-api-access-6l7xf" (OuterVolumeSpecName: "kube-api-access-6l7xf") pod "39d12be6-04d7-4fff-87ef-c2c7c4c18f58" (UID: "39d12be6-04d7-4fff-87ef-c2c7c4c18f58"). InnerVolumeSpecName "kube-api-access-6l7xf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 12:51:29.427721 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.427682 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-oauth-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:51:29.427721 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.427716 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6l7xf\" (UniqueName: \"kubernetes.io/projected/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-kube-api-access-6l7xf\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:51:29.427721 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.427726 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39d12be6-04d7-4fff-87ef-c2c7c4c18f58-console-serving-cert\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:51:29.987222 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.987196 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54ff64c78c-cq6lm_39d12be6-04d7-4fff-87ef-c2c7c4c18f58/console/0.log" Feb 17 12:51:29.987626 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.987234 2573 generic.go:358] "Generic (PLEG): container finished" podID="39d12be6-04d7-4fff-87ef-c2c7c4c18f58" containerID="b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9" exitCode=2 Feb 17 12:51:29.987626 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.987303 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54ff64c78c-cq6lm" Feb 17 12:51:29.987626 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.987317 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54ff64c78c-cq6lm" event={"ID":"39d12be6-04d7-4fff-87ef-c2c7c4c18f58","Type":"ContainerDied","Data":"b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9"} Feb 17 12:51:29.987626 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.987353 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54ff64c78c-cq6lm" event={"ID":"39d12be6-04d7-4fff-87ef-c2c7c4c18f58","Type":"ContainerDied","Data":"55cca42129beaf651d3b9df32136a5b6d5452ceab39adad36d756aa1f89ed7c6"} Feb 17 12:51:29.987626 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.987368 2573 scope.go:117] "RemoveContainer" containerID="b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9" Feb 17 12:51:29.996653 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.996637 2573 scope.go:117] "RemoveContainer" containerID="b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9" Feb 17 12:51:29.996894 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:51:29.996878 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9\": container with ID starting with b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9 not found: ID does not exist" containerID="b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9" Feb 17 12:51:29.996931 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:29.996905 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9"} 
err="failed to get container status \"b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9\": rpc error: code = NotFound desc = could not find container \"b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9\": container with ID starting with b808f703653d8f350914fd0349d1cbbbd8accb1da3ec3ea0eb72f6a4d6331bf9 not found: ID does not exist" Feb 17 12:51:30.007836 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:30.007816 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54ff64c78c-cq6lm"] Feb 17 12:51:30.011100 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:30.011079 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54ff64c78c-cq6lm"] Feb 17 12:51:30.933220 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:51:30.933188 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39d12be6-04d7-4fff-87ef-c2c7c4c18f58" path="/var/lib/kubelet/pods/39d12be6-04d7-4fff-87ef-c2c7c4c18f58/volumes" Feb 17 12:52:43.140034 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.139997 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l"] Feb 17 12:52:43.140526 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.140323 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39d12be6-04d7-4fff-87ef-c2c7c4c18f58" containerName="console" Feb 17 12:52:43.140526 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.140335 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d12be6-04d7-4fff-87ef-c2c7c4c18f58" containerName="console" Feb 17 12:52:43.140526 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.140399 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="39d12be6-04d7-4fff-87ef-c2c7c4c18f58" containerName="console" Feb 17 12:52:43.143280 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.143264 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" Feb 17 12:52:43.145908 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.145883 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"feast-operator-system\"/\"kube-root-ca.crt\"" Feb 17 12:52:43.148323 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.148305 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"feast-operator-system\"/\"openshift-service-ca.crt\"" Feb 17 12:52:43.148323 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.148313 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"feast-operator-system\"/\"feast-operator-controller-manager-dockercfg-6l6zm\"" Feb 17 12:52:43.152494 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.152469 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l"] Feb 17 12:52:43.229395 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.229344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m96jf\" (UniqueName: \"kubernetes.io/projected/52e1160d-5658-43c9-b4a3-eee99ff456aa-kube-api-access-m96jf\") pod \"feast-operator-controller-manager-8c74c7748-lm82l\" (UID: \"52e1160d-5658-43c9-b4a3-eee99ff456aa\") " pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" Feb 17 12:52:43.330424 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.330391 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m96jf\" (UniqueName: \"kubernetes.io/projected/52e1160d-5658-43c9-b4a3-eee99ff456aa-kube-api-access-m96jf\") pod \"feast-operator-controller-manager-8c74c7748-lm82l\" (UID: \"52e1160d-5658-43c9-b4a3-eee99ff456aa\") " pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" Feb 17 12:52:43.337925 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.337904 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m96jf\" (UniqueName: \"kubernetes.io/projected/52e1160d-5658-43c9-b4a3-eee99ff456aa-kube-api-access-m96jf\") pod \"feast-operator-controller-manager-8c74c7748-lm82l\" (UID: \"52e1160d-5658-43c9-b4a3-eee99ff456aa\") " pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" Feb 17 12:52:43.453952 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.453917 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" Feb 17 12:52:43.572928 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.572901 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l"] Feb 17 12:52:43.575595 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:52:43.575565 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52e1160d_5658_43c9_b4a3_eee99ff456aa.slice/crio-4526cbba45fe191da2e738a22524229a75dfbdbbdb77f36b40db60672b1b39ec WatchSource:0}: Error finding container 4526cbba45fe191da2e738a22524229a75dfbdbbdb77f36b40db60672b1b39ec: Status 404 returned error can't find the container with id 4526cbba45fe191da2e738a22524229a75dfbdbbdb77f36b40db60672b1b39ec Feb 17 12:52:43.577750 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:43.577731 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 12:52:44.203468 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:44.203424 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" event={"ID":"52e1160d-5658-43c9-b4a3-eee99ff456aa","Type":"ContainerStarted","Data":"4526cbba45fe191da2e738a22524229a75dfbdbbdb77f36b40db60672b1b39ec"} Feb 17 12:52:46.210379 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:46.210344 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" event={"ID":"52e1160d-5658-43c9-b4a3-eee99ff456aa","Type":"ContainerStarted","Data":"b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d"} Feb 17 12:52:46.210805 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:46.210457 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" Feb 17 12:52:46.225303 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:46.225251 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" podStartSLOduration=1.013222527 podStartE2EDuration="3.22523577s" podCreationTimestamp="2026-02-17 12:52:43 +0000 UTC" firstStartedPulling="2026-02-17 12:52:43.57788941 +0000 UTC m=+387.172704218" lastFinishedPulling="2026-02-17 12:52:45.789902658 +0000 UTC m=+389.384717461" observedRunningTime="2026-02-17 12:52:46.223838863 +0000 UTC m=+389.818653687" watchObservedRunningTime="2026-02-17 12:52:46.22523577 +0000 UTC m=+389.820050600" Feb 17 12:52:57.215444 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:52:57.215414 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" Feb 17 12:53:03.620976 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.620936 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz"] Feb 17 12:53:03.624875 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.624856 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:03.627384 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.627364 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-registry-tls\"" Feb 17 12:53:03.627605 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.627439 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-online-tls\"" Feb 17 12:53:03.627677 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.627625 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-ui-tls\"" Feb 17 12:53:03.627744 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.627688 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-offline-tls\"" Feb 17 12:53:03.628773 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.628747 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-feast\"/\"kube-root-ca.crt\"" Feb 17 12:53:03.628888 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.628805 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-dockercfg-z67q6\"" Feb 17 12:53:03.629104 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.629086 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-feast\"/\"openshift-service-ca.crt\"" Feb 17 12:53:03.636569 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.636546 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz"] Feb 17 12:53:03.693890 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.693857 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x687b\" (UniqueName: \"kubernetes.io/projected/f8907d0a-e6e5-4fe4-8a01-67376d754548-kube-api-access-x687b\") pod \"feast-simple-feast-setup-5887dd77db-mp2mz\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:03.694052 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.693914 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-registry-tls\") pod \"feast-simple-feast-setup-5887dd77db-mp2mz\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:03.694052 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.693940 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-ui-tls\") pod \"feast-simple-feast-setup-5887dd77db-mp2mz\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:03.694052 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.693975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/f8907d0a-e6e5-4fe4-8a01-67376d754548-feast-data\") pod \"feast-simple-feast-setup-5887dd77db-mp2mz\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " 
pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:03.694052 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.694037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-offline-tls\") pod \"feast-simple-feast-setup-5887dd77db-mp2mz\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:03.694202 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.694081 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-online-tls\") pod \"feast-simple-feast-setup-5887dd77db-mp2mz\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:03.794999 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.794959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-online-tls\") pod \"feast-simple-feast-setup-5887dd77db-mp2mz\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:03.795183 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.795019 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x687b\" (UniqueName: \"kubernetes.io/projected/f8907d0a-e6e5-4fe4-8a01-67376d754548-kube-api-access-x687b\") pod \"feast-simple-feast-setup-5887dd77db-mp2mz\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:03.795183 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.795081 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-registry-tls\") pod \"feast-simple-feast-setup-5887dd77db-mp2mz\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:03.795319 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.795232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-ui-tls\") pod \"feast-simple-feast-setup-5887dd77db-mp2mz\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:03.795319 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:03.795278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/f8907d0a-e6e5-4fe4-8a01-67376d754548-feast-data\") pod \"feast-simple-feast-setup-5887dd77db-mp2mz\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:03.795444 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:53:03.795336 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-ui-tls: secret "feast-simple-feast-setup-ui-tls" not found Feb 17 12:53:03.795444 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:53:03.795409 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-ui-tls 
podName:f8907d0a-e6e5-4fe4-8a01-67376d754548 nodeName:}" failed. No retries permitted until 2026-02-17 12:53:04.295389313 +0000 UTC m=+407.890204138 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ui-tls" (UniqueName: "kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-ui-tls") pod "feast-simple-feast-setup-5887dd77db-mp2mz" (UID: "f8907d0a-e6e5-4fe4-8a01-67376d754548") : secret "feast-simple-feast-setup-ui-tls" not found
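The mount failed because the ui-tls secret had not been created yet, so the operation was parked with durationBeforeRetry 500ms; exactly 500ms later the retry succeeds (12:53:04.299 below). On repeated failures the kubelet's per-volume backoff doubles the delay up to a cap of roughly two minutes (the nestedpendingoperations/exponentialbackoff machinery). A sketch of that schedule under those assumptions:

    package main

    import (
        "fmt"
        "time"
    )

    const (
        initialWait = 500 * time.Millisecond          // first durationBeforeRetry
        maxWait     = 2*time.Minute + 2*time.Second   // assumed cap, per kubelet source
    )

    type backoff struct{ wait time.Duration }

    // fail records a failure and returns when the next attempt is permitted.
    func (b *backoff) fail(now time.Time) time.Time {
        if b.wait == 0 {
            b.wait = initialWait
        } else {
            b.wait *= 2
            if b.wait > maxWait {
                b.wait = maxWait
            }
        }
        return now.Add(b.wait)
    }

    func main() {
        var b backoff
        // Timestamp of the first failure, taken from the log entry above.
        now := time.Date(2026, 2, 17, 12, 53, 3, 795389313, time.UTC)
        for i := 1; i <= 5; i++ {
            retry := b.fail(now)
            fmt.Printf("failure %d: no retries permitted until %s (durationBeforeRetry %v)\n",
                i, retry.Format("15:04:05.000000000"), b.wait)
            now = retry
        }
    }

The first line it prints, "no retries permitted until 12:53:04.295389313 (durationBeforeRetry 500ms)", matches the logged retry deadline.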
"MountVolume.SetUp succeeded for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-ui-tls\") pod \"feast-simple-feast-setup-5887dd77db-mp2mz\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:04.543770 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:04.543736 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:04.666508 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:04.666477 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz"] Feb 17 12:53:04.667471 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:53:04.667442 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8907d0a_e6e5_4fe4_8a01_67376d754548.slice/crio-b3bffed5361c2a269c9771b404a6f0ddc760de4a28cf07641af23798ff102a5b WatchSource:0}: Error finding container b3bffed5361c2a269c9771b404a6f0ddc760de4a28cf07641af23798ff102a5b: Status 404 returned error can't find the container with id b3bffed5361c2a269c9771b404a6f0ddc760de4a28cf07641af23798ff102a5b Feb 17 12:53:05.265287 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:05.265250 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" event={"ID":"f8907d0a-e6e5-4fe4-8a01-67376d754548","Type":"ContainerStarted","Data":"b3bffed5361c2a269c9771b404a6f0ddc760de4a28cf07641af23798ff102a5b"} Feb 17 12:53:21.321172 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:21.321078 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" event={"ID":"f8907d0a-e6e5-4fe4-8a01-67376d754548","Type":"ContainerStarted","Data":"8d004835785ca90ff8499a3302b95fd34438546088fd5474f31f8a547e6e1eef"} Feb 17 12:53:26.340414 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:26.340379 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerID="8d004835785ca90ff8499a3302b95fd34438546088fd5474f31f8a547e6e1eef" exitCode=0 Feb 17 12:53:26.340840 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:26.340449 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" event={"ID":"f8907d0a-e6e5-4fe4-8a01-67376d754548","Type":"ContainerDied","Data":"8d004835785ca90ff8499a3302b95fd34438546088fd5474f31f8a547e6e1eef"} Feb 17 12:53:27.348078 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:27.348036 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" event={"ID":"f8907d0a-e6e5-4fe4-8a01-67376d754548","Type":"ContainerStarted","Data":"fb2db8f9cad50a5f1317270e9013ac78985901739607b620478af11ec5ab9a57"} Feb 17 12:53:27.348078 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:27.348083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" event={"ID":"f8907d0a-e6e5-4fe4-8a01-67376d754548","Type":"ContainerStarted","Data":"2842cc3d66ccb018b97ffb9d1bd00589e2d873342b6e628ee4684aa1489c533b"} Feb 17 12:53:27.348680 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:27.348097 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" 
event={"ID":"f8907d0a-e6e5-4fe4-8a01-67376d754548","Type":"ContainerStarted","Data":"fe4bd4b8e1e4326d51983c68e9592330c37f141b288ede4b1dc134ced4866b5d"} Feb 17 12:53:27.348680 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:27.348129 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" event={"ID":"f8907d0a-e6e5-4fe4-8a01-67376d754548","Type":"ContainerStarted","Data":"0bf97e45345faa16ee67c07d99521f810ae2aa2b0ea4e26080d06feecec8811d"} Feb 17 12:53:27.370385 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:27.370294 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podStartSLOduration=7.9893151620000005 podStartE2EDuration="24.370272095s" podCreationTimestamp="2026-02-17 12:53:03 +0000 UTC" firstStartedPulling="2026-02-17 12:53:04.669331316 +0000 UTC m=+408.264146124" lastFinishedPulling="2026-02-17 12:53:21.05028825 +0000 UTC m=+424.645103057" observedRunningTime="2026-02-17 12:53:27.36739373 +0000 UTC m=+430.962208557" watchObservedRunningTime="2026-02-17 12:53:27.370272095 +0000 UTC m=+430.965086917" Feb 17 12:53:28.544402 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:28.544188 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:28.544904 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:28.544873 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:28.544904 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:28.544905 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:28.545076 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:28.544919 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:28.546257 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:28.546224 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.27:8443: connect: connection refused" Feb 17 12:53:28.546365 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:28.546267 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="online" probeResult="failure" output="Get \"https://10.133.0.27:6567/health\": dial tcp 10.133.0.27:6567: connect: connection refused" Feb 17 12:53:28.546365 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:28.546356 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="offline" probeResult="failure" output="dial tcp 10.133.0.27:8816: connect: connection refused" Feb 17 12:53:28.546495 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:28.546405 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="registry" probeResult="failure" output="dial tcp 10.133.0.27:6571: connect: connection refused" Feb 17 12:53:31.545285 ip-10-0-131-216 
kubenswrapper[2573]: I0217 12:53:31.545233 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:31.545771 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:31.545337 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:31.545771 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:31.545548 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:31.545917 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:31.545893 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:31.546036 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:31.545925 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:31.546036 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:31.545939 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:31.546181 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:31.546099 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:31.546252 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:31.546231 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:31.546608 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:31.546591 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:31.550360 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:31.550344 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:32.371009 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:32.370979 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:53:32.373898 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:53:32.373879 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:54:35.072220 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.072184 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz"] Feb 17 12:54:35.072728 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.072589 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="registry" containerID="cri-o://0bf97e45345faa16ee67c07d99521f810ae2aa2b0ea4e26080d06feecec8811d" gracePeriod=30 Feb 17 12:54:35.072728 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.072685 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="offline" 
containerID="cri-o://2842cc3d66ccb018b97ffb9d1bd00589e2d873342b6e628ee4684aa1489c533b" gracePeriod=30 Feb 17 12:54:35.072874 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.072701 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="ui" containerID="cri-o://fb2db8f9cad50a5f1317270e9013ac78985901739607b620478af11ec5ab9a57" gracePeriod=30 Feb 17 12:54:35.072874 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.072734 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="online" containerID="cri-o://fe4bd4b8e1e4326d51983c68e9592330c37f141b288ede4b1dc134ced4866b5d" gracePeriod=30 Feb 17 12:54:35.268661 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.268610 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss"] Feb 17 12:54:35.273138 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.273085 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.275901 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.275586 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-dockercfg-pl96b\"" Feb 17 12:54:35.281919 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.281889 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss"] Feb 17 12:54:35.377062 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.376963 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-online-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.377062 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.377017 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-ui-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.377325 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.377071 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-feast-data\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.377325 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.377179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcjps\" (UniqueName: \"kubernetes.io/projected/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-kube-api-access-qcjps\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.377325 
ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.377213 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-registry-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.377325 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.377273 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-offline-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.478301 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.478266 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-feast-data\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.478505 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.478335 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcjps\" (UniqueName: \"kubernetes.io/projected/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-kube-api-access-qcjps\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.478505 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.478367 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-registry-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.478505 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.478417 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-offline-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.478505 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.478467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-online-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.478505 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.478497 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-ui-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.478824 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:54:35.478606 2573 
secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-ui-tls: secret "feast-simple-feast-setup-ui-tls" not found Feb 17 12:54:35.478824 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:54:35.478680 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-ui-tls podName:57fb8008-7e68-48e7-9bde-4a9f7e6301a0 nodeName:}" failed. No retries permitted until 2026-02-17 12:54:35.978656865 +0000 UTC m=+499.573471704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ui-tls" (UniqueName: "kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-ui-tls") pod "feast-simple-feast-setup-5887dd77db-qzdss" (UID: "57fb8008-7e68-48e7-9bde-4a9f7e6301a0") : secret "feast-simple-feast-setup-ui-tls" not found Feb 17 12:54:35.478824 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.478768 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-feast-data\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.479329 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:54:35.478901 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-online-tls: secret "feast-simple-feast-setup-online-tls" not found Feb 17 12:54:35.479329 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:54:35.478957 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-online-tls podName:57fb8008-7e68-48e7-9bde-4a9f7e6301a0 nodeName:}" failed. No retries permitted until 2026-02-17 12:54:35.978931363 +0000 UTC m=+499.573746166 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "online-tls" (UniqueName: "kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-online-tls") pod "feast-simple-feast-setup-5887dd77db-qzdss" (UID: "57fb8008-7e68-48e7-9bde-4a9f7e6301a0") : secret "feast-simple-feast-setup-online-tls" not found Feb 17 12:54:35.481654 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.481626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-registry-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.481780 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.481661 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-offline-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.489173 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.489144 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcjps\" (UniqueName: \"kubernetes.io/projected/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-kube-api-access-qcjps\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.982836 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.982748 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-online-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.982836 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.982793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-ui-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.985208 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.985182 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-ui-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:35.985350 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:35.985323 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-online-tls\") pod \"feast-simple-feast-setup-5887dd77db-qzdss\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:36.187526 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:36.187490 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:36.313182 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:36.313151 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss"] Feb 17 12:54:36.314099 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:54:36.314073 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57fb8008_7e68_48e7_9bde_4a9f7e6301a0.slice/crio-783c8bd8a7ebf71f006aaddd68a20016f81b0fe4a6fb8fc486110064834a4d52 WatchSource:0}: Error finding container 783c8bd8a7ebf71f006aaddd68a20016f81b0fe4a6fb8fc486110064834a4d52: Status 404 returned error can't find the container with id 783c8bd8a7ebf71f006aaddd68a20016f81b0fe4a6fb8fc486110064834a4d52 Feb 17 12:54:36.569177 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:36.569068 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" event={"ID":"57fb8008-7e68-48e7-9bde-4a9f7e6301a0","Type":"ContainerStarted","Data":"04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f"} Feb 17 12:54:36.569177 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:36.569125 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" event={"ID":"57fb8008-7e68-48e7-9bde-4a9f7e6301a0","Type":"ContainerStarted","Data":"783c8bd8a7ebf71f006aaddd68a20016f81b0fe4a6fb8fc486110064834a4d52"} Feb 17 12:54:36.572497 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:36.572468 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerID="fb2db8f9cad50a5f1317270e9013ac78985901739607b620478af11ec5ab9a57" exitCode=0 Feb 17 12:54:36.572497 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:36.572493 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerID="fe4bd4b8e1e4326d51983c68e9592330c37f141b288ede4b1dc134ced4866b5d" exitCode=0 Feb 17 12:54:36.572703 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:36.572530 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" event={"ID":"f8907d0a-e6e5-4fe4-8a01-67376d754548","Type":"ContainerDied","Data":"fb2db8f9cad50a5f1317270e9013ac78985901739607b620478af11ec5ab9a57"} Feb 17 12:54:36.572703 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:36.572552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" event={"ID":"f8907d0a-e6e5-4fe4-8a01-67376d754548","Type":"ContainerDied","Data":"fe4bd4b8e1e4326d51983c68e9592330c37f141b288ede4b1dc134ced4866b5d"} Feb 17 12:54:39.581848 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:39.581817 2573 generic.go:358] "Generic (PLEG): container finished" podID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerID="04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f" exitCode=0 Feb 17 12:54:39.582228 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:39.581864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" event={"ID":"57fb8008-7e68-48e7-9bde-4a9f7e6301a0","Type":"ContainerDied","Data":"04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f"} Feb 17 12:54:40.589421 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:40.589376 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" event={"ID":"57fb8008-7e68-48e7-9bde-4a9f7e6301a0","Type":"ContainerStarted","Data":"6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880"} Feb 17 12:54:40.589911 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:40.589434 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" event={"ID":"57fb8008-7e68-48e7-9bde-4a9f7e6301a0","Type":"ContainerStarted","Data":"f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b"} Feb 17 12:54:40.589911 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:40.589451 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" event={"ID":"57fb8008-7e68-48e7-9bde-4a9f7e6301a0","Type":"ContainerStarted","Data":"a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f"} Feb 17 12:54:40.589911 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:40.589465 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" event={"ID":"57fb8008-7e68-48e7-9bde-4a9f7e6301a0","Type":"ContainerStarted","Data":"f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba"} Feb 17 12:54:40.614627 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:40.614568 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podStartSLOduration=5.614547931 podStartE2EDuration="5.614547931s" podCreationTimestamp="2026-02-17 12:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 12:54:40.611489858 +0000 UTC m=+504.206304704" watchObservedRunningTime="2026-02-17 12:54:40.614547931 +0000 UTC m=+504.209362757" Feb 17 12:54:41.546786 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:41.546733 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.27:8443: connect: connection refused" Feb 17 12:54:42.188390 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:42.188353 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:42.188922 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:42.188904 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:42.189023 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:42.189013 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:42.189131 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:42.189103 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:42.190595 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:42.190485 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="registry" probeResult="failure" output="dial tcp 10.133.0.28:6571: connect: connection refused" Feb 17 12:54:42.190595 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:42.190516 2573 
prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="online" probeResult="failure" output="Get \"https://10.133.0.28:6567/health\": dial tcp 10.133.0.28:6567: connect: connection refused" Feb 17 12:54:42.190783 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:42.190612 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.28:8443: connect: connection refused" Feb 17 12:54:42.190783 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:42.190723 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="offline" probeResult="failure" output="dial tcp 10.133.0.28:8816: connect: connection refused" Feb 17 12:54:42.371838 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:42.371792 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="online" probeResult="failure" output="Get \"https://10.133.0.27:6567/health\": dial tcp 10.133.0.27:6567: connect: connection refused" Feb 17 12:54:45.188791 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:45.188757 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:45.189297 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:45.189014 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:45.189359 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:45.189307 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:45.189359 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:45.189332 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:45.189492 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:45.189478 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:45.193707 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:45.193686 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:45.607060 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:45.606976 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:45.607060 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:45.607016 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:45.607060 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:45.607028 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:45.607346 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:45.607280 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:45.607488 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:45.607470 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:45.609881 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:45.609863 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:54:51.546721 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:51.546668 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.27:8443: connect: connection refused" Feb 17 12:54:52.371732 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:54:52.371690 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="online" probeResult="failure" output="Get \"https://10.133.0.27:6567/health\": dial tcp 10.133.0.27:6567: connect: connection refused" Feb 17 12:55:01.546175 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:01.546052 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.27:8443: connect: connection refused" Feb 17 12:55:01.546580 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:01.546227 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:55:02.372197 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:02.372153 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="online" probeResult="failure" output="Get \"https://10.133.0.27:6567/health\": dial tcp 10.133.0.27:6567: connect: connection refused" Feb 17 12:55:02.372399 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:02.372265 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:55:05.009247 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.009212 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56554b9687-q8sz2"] Feb 17 12:55:05.012522 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.012496 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.029048 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.024549 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56554b9687-q8sz2"] Feb 17 12:55:05.046154 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.046104 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae8b475f-3383-4364-9d53-9f68d6cf3f63-trusted-ca-bundle\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.046298 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.046165 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae8b475f-3383-4364-9d53-9f68d6cf3f63-oauth-serving-cert\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.046298 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.046210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae8b475f-3383-4364-9d53-9f68d6cf3f63-console-config\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.046389 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.046349 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae8b475f-3383-4364-9d53-9f68d6cf3f63-console-oauth-config\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.046423 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.046389 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tfwz\" (UniqueName: \"kubernetes.io/projected/ae8b475f-3383-4364-9d53-9f68d6cf3f63-kube-api-access-8tfwz\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.046502 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.046481 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae8b475f-3383-4364-9d53-9f68d6cf3f63-service-ca\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.046536 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.046526 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8b475f-3383-4364-9d53-9f68d6cf3f63-console-serving-cert\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.147367 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.147338 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ae8b475f-3383-4364-9d53-9f68d6cf3f63-service-ca\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.147483 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.147389 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8b475f-3383-4364-9d53-9f68d6cf3f63-console-serving-cert\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.147483 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.147464 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae8b475f-3383-4364-9d53-9f68d6cf3f63-trusted-ca-bundle\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.147583 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.147493 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae8b475f-3383-4364-9d53-9f68d6cf3f63-oauth-serving-cert\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.147583 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.147530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae8b475f-3383-4364-9d53-9f68d6cf3f63-console-config\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.147583 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.147565 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae8b475f-3383-4364-9d53-9f68d6cf3f63-console-oauth-config\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.147722 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.147589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tfwz\" (UniqueName: \"kubernetes.io/projected/ae8b475f-3383-4364-9d53-9f68d6cf3f63-kube-api-access-8tfwz\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.148201 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.148175 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae8b475f-3383-4364-9d53-9f68d6cf3f63-service-ca\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.148310 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.148236 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae8b475f-3383-4364-9d53-9f68d6cf3f63-oauth-serving-cert\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.148353 ip-10-0-131-216 
kubenswrapper[2573]: I0217 12:55:05.148301 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae8b475f-3383-4364-9d53-9f68d6cf3f63-console-config\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.148353 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.148307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae8b475f-3383-4364-9d53-9f68d6cf3f63-trusted-ca-bundle\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.149968 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.149948 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae8b475f-3383-4364-9d53-9f68d6cf3f63-console-oauth-config\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.150157 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.150140 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8b475f-3383-4364-9d53-9f68d6cf3f63-console-serving-cert\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.155181 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.155160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tfwz\" (UniqueName: \"kubernetes.io/projected/ae8b475f-3383-4364-9d53-9f68d6cf3f63-kube-api-access-8tfwz\") pod \"console-56554b9687-q8sz2\" (UID: \"ae8b475f-3383-4364-9d53-9f68d6cf3f63\") " pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.325311 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.325220 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:05.446608 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.446578 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56554b9687-q8sz2"] Feb 17 12:55:05.450156 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:55:05.450103 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae8b475f_3383_4364_9d53_9f68d6cf3f63.slice/crio-5ebe258a936d663ee7977b9110928c64a1fe66502c5082383fb96bc4bef0c5b2 WatchSource:0}: Error finding container 5ebe258a936d663ee7977b9110928c64a1fe66502c5082383fb96bc4bef0c5b2: Status 404 returned error can't find the container with id 5ebe258a936d663ee7977b9110928c64a1fe66502c5082383fb96bc4bef0c5b2 Feb 17 12:55:05.677252 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.677219 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerID="2842cc3d66ccb018b97ffb9d1bd00589e2d873342b6e628ee4684aa1489c533b" exitCode=137 Feb 17 12:55:05.677252 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.677250 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerID="0bf97e45345faa16ee67c07d99521f810ae2aa2b0ea4e26080d06feecec8811d" exitCode=137 Feb 17 12:55:05.677470 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.677288 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" event={"ID":"f8907d0a-e6e5-4fe4-8a01-67376d754548","Type":"ContainerDied","Data":"2842cc3d66ccb018b97ffb9d1bd00589e2d873342b6e628ee4684aa1489c533b"} Feb 17 12:55:05.677470 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.677323 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" event={"ID":"f8907d0a-e6e5-4fe4-8a01-67376d754548","Type":"ContainerDied","Data":"0bf97e45345faa16ee67c07d99521f810ae2aa2b0ea4e26080d06feecec8811d"} Feb 17 12:55:05.678644 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.678623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56554b9687-q8sz2" event={"ID":"ae8b475f-3383-4364-9d53-9f68d6cf3f63","Type":"ContainerStarted","Data":"428d0f25be1813391b81cc39d5e79074308b60acf01d33b182c794f56d9ba5b6"} Feb 17 12:55:05.678726 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.678649 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56554b9687-q8sz2" event={"ID":"ae8b475f-3383-4364-9d53-9f68d6cf3f63","Type":"ContainerStarted","Data":"5ebe258a936d663ee7977b9110928c64a1fe66502c5082383fb96bc4bef0c5b2"} Feb 17 12:55:05.697129 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.697072 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56554b9687-q8sz2" podStartSLOduration=1.697061368 podStartE2EDuration="1.697061368s" podCreationTimestamp="2026-02-17 12:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 12:55:05.694956236 +0000 UTC m=+529.289771046" watchObservedRunningTime="2026-02-17 12:55:05.697061368 +0000 UTC m=+529.291876193" Feb 17 12:55:05.710531 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.710515 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:55:05.752151 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.752099 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-online-tls\") pod \"f8907d0a-e6e5-4fe4-8a01-67376d754548\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " Feb 17 12:55:05.752330 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.752183 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x687b\" (UniqueName: \"kubernetes.io/projected/f8907d0a-e6e5-4fe4-8a01-67376d754548-kube-api-access-x687b\") pod \"f8907d0a-e6e5-4fe4-8a01-67376d754548\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " Feb 17 12:55:05.752330 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.752216 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/f8907d0a-e6e5-4fe4-8a01-67376d754548-feast-data\") pod \"f8907d0a-e6e5-4fe4-8a01-67376d754548\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " Feb 17 12:55:05.752330 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.752271 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-offline-tls\") pod \"f8907d0a-e6e5-4fe4-8a01-67376d754548\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " Feb 17 12:55:05.752330 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.752327 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-registry-tls\") pod \"f8907d0a-e6e5-4fe4-8a01-67376d754548\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " Feb 17 12:55:05.752542 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.752371 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-ui-tls\") pod \"f8907d0a-e6e5-4fe4-8a01-67376d754548\" (UID: \"f8907d0a-e6e5-4fe4-8a01-67376d754548\") " Feb 17 12:55:05.752860 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.752826 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8907d0a-e6e5-4fe4-8a01-67376d754548-feast-data" (OuterVolumeSpecName: "feast-data") pod "f8907d0a-e6e5-4fe4-8a01-67376d754548" (UID: "f8907d0a-e6e5-4fe4-8a01-67376d754548"). InnerVolumeSpecName "feast-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Feb 17 12:55:05.754954 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.754919 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-ui-tls" (OuterVolumeSpecName: "ui-tls") pod "f8907d0a-e6e5-4fe4-8a01-67376d754548" (UID: "f8907d0a-e6e5-4fe4-8a01-67376d754548"). InnerVolumeSpecName "ui-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:55:05.755059 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.755036 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-online-tls" (OuterVolumeSpecName: "online-tls") pod "f8907d0a-e6e5-4fe4-8a01-67376d754548" (UID: "f8907d0a-e6e5-4fe4-8a01-67376d754548"). InnerVolumeSpecName "online-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:55:05.755328 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.755057 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8907d0a-e6e5-4fe4-8a01-67376d754548-kube-api-access-x687b" (OuterVolumeSpecName: "kube-api-access-x687b") pod "f8907d0a-e6e5-4fe4-8a01-67376d754548" (UID: "f8907d0a-e6e5-4fe4-8a01-67376d754548"). InnerVolumeSpecName "kube-api-access-x687b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 12:55:05.755328 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.755064 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-offline-tls" (OuterVolumeSpecName: "offline-tls") pod "f8907d0a-e6e5-4fe4-8a01-67376d754548" (UID: "f8907d0a-e6e5-4fe4-8a01-67376d754548"). InnerVolumeSpecName "offline-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:55:05.755328 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.755307 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f8907d0a-e6e5-4fe4-8a01-67376d754548" (UID: "f8907d0a-e6e5-4fe4-8a01-67376d754548"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:55:05.853269 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.853180 2573 reconciler_common.go:299] "Volume detached for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-ui-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:05.853269 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.853212 2573 reconciler_common.go:299] "Volume detached for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-online-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:05.853269 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.853227 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x687b\" (UniqueName: \"kubernetes.io/projected/f8907d0a-e6e5-4fe4-8a01-67376d754548-kube-api-access-x687b\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:05.853269 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.853237 2573 reconciler_common.go:299] "Volume detached for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/f8907d0a-e6e5-4fe4-8a01-67376d754548-feast-data\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:05.853269 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.853246 2573 reconciler_common.go:299] "Volume detached for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-offline-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:05.853269 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:05.853254 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/f8907d0a-e6e5-4fe4-8a01-67376d754548-registry-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:06.684837 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:06.684808 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" Feb 17 12:55:06.684837 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:06.684816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz" event={"ID":"f8907d0a-e6e5-4fe4-8a01-67376d754548","Type":"ContainerDied","Data":"b3bffed5361c2a269c9771b404a6f0ddc760de4a28cf07641af23798ff102a5b"} Feb 17 12:55:06.685389 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:06.684866 2573 scope.go:117] "RemoveContainer" containerID="fb2db8f9cad50a5f1317270e9013ac78985901739607b620478af11ec5ab9a57" Feb 17 12:55:06.697097 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:06.697078 2573 scope.go:117] "RemoveContainer" containerID="2842cc3d66ccb018b97ffb9d1bd00589e2d873342b6e628ee4684aa1489c533b" Feb 17 12:55:06.704778 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:06.704759 2573 scope.go:117] "RemoveContainer" containerID="fe4bd4b8e1e4326d51983c68e9592330c37f141b288ede4b1dc134ced4866b5d" Feb 17 12:55:06.709192 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:06.709170 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz"] Feb 17 12:55:06.713554 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:06.713530 2573 scope.go:117] "RemoveContainer" containerID="0bf97e45345faa16ee67c07d99521f810ae2aa2b0ea4e26080d06feecec8811d" Feb 17 12:55:06.714000 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:06.713982 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-5887dd77db-mp2mz"] Feb 17 12:55:06.721140 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:06.721101 2573 scope.go:117] "RemoveContainer" containerID="8d004835785ca90ff8499a3302b95fd34438546088fd5474f31f8a547e6e1eef" Feb 17 12:55:06.933616 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:06.933585 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" path="/var/lib/kubelet/pods/f8907d0a-e6e5-4fe4-8a01-67376d754548/volumes" Feb 17 12:55:15.325945 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:15.325907 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:15.326443 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:15.325957 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:15.330403 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:15.330384 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:15.718645 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:15.718615 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56554b9687-q8sz2" Feb 17 12:55:15.761046 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:15.761013 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74b4d54459-cnrtv"] Feb 17 12:55:40.781046 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:40.780987 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74b4d54459-cnrtv" podUID="eadf4e60-1e2b-422f-8b32-1c63868d413f" containerName="console" containerID="cri-o://f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e" gracePeriod=15 Feb 17 12:55:41.023839 ip-10-0-131-216 kubenswrapper[2573]: I0217 
12:55:41.023818 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74b4d54459-cnrtv_eadf4e60-1e2b-422f-8b32-1c63868d413f/console/0.log" Feb 17 12:55:41.023992 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.023877 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:55:41.055085 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055026 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rspwq\" (UniqueName: \"kubernetes.io/projected/eadf4e60-1e2b-422f-8b32-1c63868d413f-kube-api-access-rspwq\") pod \"eadf4e60-1e2b-422f-8b32-1c63868d413f\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " Feb 17 12:55:41.055085 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055056 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-service-ca\") pod \"eadf4e60-1e2b-422f-8b32-1c63868d413f\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " Feb 17 12:55:41.055085 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055080 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-oauth-config\") pod \"eadf4e60-1e2b-422f-8b32-1c63868d413f\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " Feb 17 12:55:41.055349 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055172 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-oauth-serving-cert\") pod \"eadf4e60-1e2b-422f-8b32-1c63868d413f\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " Feb 17 12:55:41.055349 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055226 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-config\") pod \"eadf4e60-1e2b-422f-8b32-1c63868d413f\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " Feb 17 12:55:41.055349 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055258 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-trusted-ca-bundle\") pod \"eadf4e60-1e2b-422f-8b32-1c63868d413f\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " Feb 17 12:55:41.055349 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055282 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-serving-cert\") pod \"eadf4e60-1e2b-422f-8b32-1c63868d413f\" (UID: \"eadf4e60-1e2b-422f-8b32-1c63868d413f\") " Feb 17 12:55:41.055602 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055503 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-service-ca" (OuterVolumeSpecName: "service-ca") pod "eadf4e60-1e2b-422f-8b32-1c63868d413f" (UID: "eadf4e60-1e2b-422f-8b32-1c63868d413f"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 12:55:41.055677 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055618 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eadf4e60-1e2b-422f-8b32-1c63868d413f" (UID: "eadf4e60-1e2b-422f-8b32-1c63868d413f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 12:55:41.055733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055673 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-config" (OuterVolumeSpecName: "console-config") pod "eadf4e60-1e2b-422f-8b32-1c63868d413f" (UID: "eadf4e60-1e2b-422f-8b32-1c63868d413f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 12:55:41.055733 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055693 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "eadf4e60-1e2b-422f-8b32-1c63868d413f" (UID: "eadf4e60-1e2b-422f-8b32-1c63868d413f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 12:55:41.055881 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055865 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-service-ca\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:41.055933 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055886 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-oauth-serving-cert\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:41.055933 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055899 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:41.055933 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.055908 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eadf4e60-1e2b-422f-8b32-1c63868d413f-trusted-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:41.057244 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.057225 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eadf4e60-1e2b-422f-8b32-1c63868d413f" (UID: "eadf4e60-1e2b-422f-8b32-1c63868d413f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:55:41.057634 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.057614 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eadf4e60-1e2b-422f-8b32-1c63868d413f" (UID: "eadf4e60-1e2b-422f-8b32-1c63868d413f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:55:41.057705 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.057630 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eadf4e60-1e2b-422f-8b32-1c63868d413f-kube-api-access-rspwq" (OuterVolumeSpecName: "kube-api-access-rspwq") pod "eadf4e60-1e2b-422f-8b32-1c63868d413f" (UID: "eadf4e60-1e2b-422f-8b32-1c63868d413f"). InnerVolumeSpecName "kube-api-access-rspwq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 12:55:41.156417 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.156389 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rspwq\" (UniqueName: \"kubernetes.io/projected/eadf4e60-1e2b-422f-8b32-1c63868d413f-kube-api-access-rspwq\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:41.156417 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.156416 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-oauth-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:41.156592 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.156426 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eadf4e60-1e2b-422f-8b32-1c63868d413f-console-serving-cert\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:55:41.788570 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.788545 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74b4d54459-cnrtv_eadf4e60-1e2b-422f-8b32-1c63868d413f/console/0.log" Feb 17 12:55:41.788987 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.788587 2573 generic.go:358] "Generic (PLEG): container finished" podID="eadf4e60-1e2b-422f-8b32-1c63868d413f" containerID="f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e" exitCode=2 Feb 17 12:55:41.788987 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.788649 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74b4d54459-cnrtv" Feb 17 12:55:41.788987 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.788665 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b4d54459-cnrtv" event={"ID":"eadf4e60-1e2b-422f-8b32-1c63868d413f","Type":"ContainerDied","Data":"f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e"} Feb 17 12:55:41.788987 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.788701 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b4d54459-cnrtv" event={"ID":"eadf4e60-1e2b-422f-8b32-1c63868d413f","Type":"ContainerDied","Data":"a210d7987cf0d07be9c1e556c6a70c11332e9ec7fe7af1d7145369bc3d46b31b"} Feb 17 12:55:41.788987 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.788716 2573 scope.go:117] "RemoveContainer" containerID="f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e" Feb 17 12:55:41.798140 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.798099 2573 scope.go:117] "RemoveContainer" containerID="f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e" Feb 17 12:55:41.798396 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:55:41.798379 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e\": container with ID starting with f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e not found: ID does not exist" containerID="f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e" Feb 17 12:55:41.798436 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.798405 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e"} err="failed to get container status \"f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e\": rpc error: code = NotFound desc = could not find container \"f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e\": container with ID starting with f8fb1e5f5909aa99cdd7a13b0fdf1f72d9531b1cef124ca7e5c585fbed4c8b6e not found: ID does not exist" Feb 17 12:55:41.810268 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.810243 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74b4d54459-cnrtv"] Feb 17 12:55:41.813665 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:41.813640 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74b4d54459-cnrtv"] Feb 17 12:55:42.933312 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:42.933281 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eadf4e60-1e2b-422f-8b32-1c63868d413f" path="/var/lib/kubelet/pods/eadf4e60-1e2b-422f-8b32-1c63868d413f/volumes" Feb 17 12:55:46.838254 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.838214 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h"] Feb 17 12:55:46.838936 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.838912 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="ui" Feb 17 12:55:46.839039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.838939 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="ui" Feb 17 12:55:46.839039 ip-10-0-131-216 
kubenswrapper[2573]: I0217 12:55:46.838963 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="online" Feb 17 12:55:46.839039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.838971 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="online" Feb 17 12:55:46.839039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839005 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eadf4e60-1e2b-422f-8b32-1c63868d413f" containerName="console" Feb 17 12:55:46.839039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839013 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="eadf4e60-1e2b-422f-8b32-1c63868d413f" containerName="console" Feb 17 12:55:46.839039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839024 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="registry" Feb 17 12:55:46.839039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839032 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="registry" Feb 17 12:55:46.839039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839040 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="offline" Feb 17 12:55:46.839463 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839049 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="offline" Feb 17 12:55:46.839463 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839068 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="feast-init" Feb 17 12:55:46.839463 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839077 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="feast-init" Feb 17 12:55:46.839463 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839190 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="eadf4e60-1e2b-422f-8b32-1c63868d413f" containerName="console" Feb 17 12:55:46.839463 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839204 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="registry" Feb 17 12:55:46.839463 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839218 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="ui" Feb 17 12:55:46.839463 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839227 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="offline" Feb 17 12:55:46.839463 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.839238 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8907d0a-e6e5-4fe4-8a01-67376d754548" containerName="online" Feb 17 12:55:46.844296 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.844274 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:46.846730 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.846694 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-ui-tls\"" Feb 17 12:55:46.846896 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.846789 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-offline-tls\"" Feb 17 12:55:46.847053 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.846981 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-online-tls\"" Feb 17 12:55:46.847053 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.847001 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-dockercfg-xrd5p\"" Feb 17 12:55:46.847219 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.846988 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-remote-registry\"/\"openshift-service-ca.crt\"" Feb 17 12:55:46.847219 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.847194 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-client-ca\"" Feb 17 12:55:46.847347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.847239 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-remote-registry\"/\"kube-root-ca.crt\"" Feb 17 12:55:46.851970 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.851773 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h"] Feb 17 12:55:46.903185 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.903155 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-online-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:46.903341 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.903199 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-ui-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:46.903398 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.903375 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-offline-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:46.903445 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.903415 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"feast-data\" (UniqueName: 
\"kubernetes.io/empty-dir/3252cc01-47da-4470-ab0b-c36315dec301-feast-data\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:46.903492 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.903445 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/3252cc01-47da-4470-ab0b-c36315dec301-registry-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:46.903492 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:46.903475 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmbgw\" (UniqueName: \"kubernetes.io/projected/3252cc01-47da-4470-ab0b-c36315dec301-kube-api-access-fmbgw\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.004859 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.004806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-online-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.004859 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.004862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-ui-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.005143 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.004959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-offline-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.005143 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:55:47.004975 2573 secret.go:189] Couldn't get secret test-ns-remote-registry/feast-simple-feast-remote-setup-online-tls: secret "feast-simple-feast-remote-setup-online-tls" not found Feb 17 12:55:47.005143 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:55:47.005006 2573 secret.go:189] Couldn't get secret test-ns-remote-registry/feast-simple-feast-remote-setup-ui-tls: secret "feast-simple-feast-remote-setup-ui-tls" not found Feb 17 12:55:47.005143 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.004984 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/3252cc01-47da-4470-ab0b-c36315dec301-feast-data\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.005143 
ip-10-0-131-216 kubenswrapper[2573]: E0217 12:55:47.005095 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-ui-tls podName:3252cc01-47da-4470-ab0b-c36315dec301 nodeName:}" failed. No retries permitted until 2026-02-17 12:55:47.505045775 +0000 UTC m=+571.099860591 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ui-tls" (UniqueName: "kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-ui-tls") pod "feast-simple-feast-remote-setup-558f8f8769-c7p6h" (UID: "3252cc01-47da-4470-ab0b-c36315dec301") : secret "feast-simple-feast-remote-setup-ui-tls" not found Feb 17 12:55:47.005430 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:55:47.005149 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-online-tls podName:3252cc01-47da-4470-ab0b-c36315dec301 nodeName:}" failed. No retries permitted until 2026-02-17 12:55:47.50513868 +0000 UTC m=+571.099953489 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "online-tls" (UniqueName: "kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-online-tls") pod "feast-simple-feast-remote-setup-558f8f8769-c7p6h" (UID: "3252cc01-47da-4470-ab0b-c36315dec301") : secret "feast-simple-feast-remote-setup-online-tls" not found Feb 17 12:55:47.005430 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.005184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/3252cc01-47da-4470-ab0b-c36315dec301-registry-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.005430 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.005217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmbgw\" (UniqueName: \"kubernetes.io/projected/3252cc01-47da-4470-ab0b-c36315dec301-kube-api-access-fmbgw\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.005430 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.005380 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/3252cc01-47da-4470-ab0b-c36315dec301-feast-data\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.005829 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.005810 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/3252cc01-47da-4470-ab0b-c36315dec301-registry-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.007413 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.007385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-offline-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " 
pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.015518 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.015493 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmbgw\" (UniqueName: \"kubernetes.io/projected/3252cc01-47da-4470-ab0b-c36315dec301-kube-api-access-fmbgw\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.511296 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.511249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-online-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.511296 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.511302 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-ui-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.513730 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.513700 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-ui-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.513730 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.513722 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-online-tls\") pod \"feast-simple-feast-remote-setup-558f8f8769-c7p6h\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.754414 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.754383 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:47.880285 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:47.880257 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h"] Feb 17 12:55:47.882222 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:55:47.882194 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3252cc01_47da_4470_ab0b_c36315dec301.slice/crio-b6b08feb6bd06f322d8f16ed8cdfd8634419c6081617f2a3c4be85f767758d7c WatchSource:0}: Error finding container b6b08feb6bd06f322d8f16ed8cdfd8634419c6081617f2a3c4be85f767758d7c: Status 404 returned error can't find the container with id b6b08feb6bd06f322d8f16ed8cdfd8634419c6081617f2a3c4be85f767758d7c Feb 17 12:55:48.813295 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:48.813257 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" event={"ID":"3252cc01-47da-4470-ab0b-c36315dec301","Type":"ContainerStarted","Data":"83def61436613c82d94b02d56f0d5209e117434b1aa471e3744cd340755aab39"} Feb 17 12:55:48.813295 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:48.813292 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" event={"ID":"3252cc01-47da-4470-ab0b-c36315dec301","Type":"ContainerStarted","Data":"b6b08feb6bd06f322d8f16ed8cdfd8634419c6081617f2a3c4be85f767758d7c"} Feb 17 12:55:51.824246 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:51.824212 2573 generic.go:358] "Generic (PLEG): container finished" podID="3252cc01-47da-4470-ab0b-c36315dec301" containerID="83def61436613c82d94b02d56f0d5209e117434b1aa471e3744cd340755aab39" exitCode=0 Feb 17 12:55:51.824631 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:51.824282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" event={"ID":"3252cc01-47da-4470-ab0b-c36315dec301","Type":"ContainerDied","Data":"83def61436613c82d94b02d56f0d5209e117434b1aa471e3744cd340755aab39"} Feb 17 12:55:52.831065 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:52.831028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" event={"ID":"3252cc01-47da-4470-ab0b-c36315dec301","Type":"ContainerStarted","Data":"cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e"} Feb 17 12:55:52.831552 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:52.831074 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" event={"ID":"3252cc01-47da-4470-ab0b-c36315dec301","Type":"ContainerStarted","Data":"f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6"} Feb 17 12:55:52.831552 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:52.831087 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" event={"ID":"3252cc01-47da-4470-ab0b-c36315dec301","Type":"ContainerStarted","Data":"c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3"} Feb 17 12:55:52.854739 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:52.854679 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podStartSLOduration=6.854656975 podStartE2EDuration="6.854656975s" podCreationTimestamp="2026-02-17 12:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 12:55:52.851150511 +0000 UTC m=+576.445965330" watchObservedRunningTime="2026-02-17 12:55:52.854656975 +0000 UTC m=+576.449471801" Feb 17 12:55:53.755143 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:53.755085 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:53.755302 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:53.755155 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:53.755302 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:53.755172 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:53.757347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:53.757305 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="offline" probeResult="failure" output="dial tcp 10.133.0.30:8816: connect: connection refused" Feb 17 12:55:53.757347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:53.757331 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="online" probeResult="failure" output="Get \"https://10.133.0.30:6567/health\": dial tcp 10.133.0.30:6567: connect: connection refused" Feb 17 12:55:53.757347 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:53.757333 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.30:8443: connect: connection refused" Feb 17 12:55:56.756020 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:56.755987 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:56.756529 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:56.756434 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:56.756529 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:56.756482 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:56.756736 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:56.756716 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:56.760650 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:56.760628 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:56.846715 ip-10-0-131-216 
kubenswrapper[2573]: I0217 12:55:56.846683 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:56.846715 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:56.846713 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:56.847285 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:56.847264 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:55:56.849580 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:55:56.849556 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:56:16.898596 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:16.898572 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/2.log" Feb 17 12:56:16.899496 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:16.899478 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/2.log" Feb 17 12:56:16.905219 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:16.905197 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log" Feb 17 12:56:16.905838 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:16.905815 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log" Feb 17 12:56:58.168988 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:58.168953 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h"] Feb 17 12:56:58.169434 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:58.169360 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="online" containerID="cri-o://c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3" gracePeriod=30 Feb 17 12:56:58.169541 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:58.169408 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="ui" containerID="cri-o://cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e" gracePeriod=30 Feb 17 12:56:58.169541 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:58.169410 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="offline" containerID="cri-o://f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6" gracePeriod=30 Feb 17 12:56:58.271164 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:58.271135 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss"] Feb 17 12:56:58.271448 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:58.271425 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="registry" containerID="cri-o://f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba" gracePeriod=30 Feb 17 12:56:58.271592 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:58.271522 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="ui" containerID="cri-o://6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880" gracePeriod=30 Feb 17 12:56:58.271683 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:58.271561 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="offline" containerID="cri-o://f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b" gracePeriod=30 Feb 17 12:56:58.271917 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:58.271796 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="online" containerID="cri-o://a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f" gracePeriod=30 Feb 17 12:56:59.048700 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:59.048670 2573 generic.go:358] "Generic (PLEG): container finished" podID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerID="6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880" exitCode=0 Feb 17 12:56:59.048700 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:59.048694 2573 generic.go:358] "Generic (PLEG): container finished" podID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerID="a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f" exitCode=0 Feb 17 12:56:59.048900 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:59.048742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" event={"ID":"57fb8008-7e68-48e7-9bde-4a9f7e6301a0","Type":"ContainerDied","Data":"6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880"} Feb 17 12:56:59.048900 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:59.048780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" event={"ID":"57fb8008-7e68-48e7-9bde-4a9f7e6301a0","Type":"ContainerDied","Data":"a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f"} Feb 17 12:56:59.050652 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:59.050630 2573 generic.go:358] "Generic (PLEG): container finished" podID="3252cc01-47da-4470-ab0b-c36315dec301" containerID="cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e" exitCode=0 Feb 17 12:56:59.050652 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:59.050648 2573 generic.go:358] "Generic (PLEG): container finished" podID="3252cc01-47da-4470-ab0b-c36315dec301" containerID="c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3" exitCode=0 Feb 17 12:56:59.050799 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:59.050701 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" event={"ID":"3252cc01-47da-4470-ab0b-c36315dec301","Type":"ContainerDied","Data":"cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e"} Feb 17 12:56:59.050799 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:56:59.050726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" event={"ID":"3252cc01-47da-4470-ab0b-c36315dec301","Type":"ContainerDied","Data":"c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3"} Feb 17 12:57:05.189429 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:05.189387 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.28:8443: connect: connection refused" Feb 17 12:57:05.607152 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:05.607014 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="online" probeResult="failure" output="Get \"https://10.133.0.28:6567/health\": dial tcp 10.133.0.28:6567: connect: connection refused" Feb 17 12:57:06.847206 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:06.847161 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.30:8443: connect: connection refused" Feb 17 12:57:06.847206 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:06.847186 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="online" probeResult="failure" output="Get \"https://10.133.0.30:6567/health\": dial tcp 10.133.0.30:6567: connect: connection refused" Feb 17 12:57:15.190268 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:15.190228 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.28:8443: connect: connection refused" Feb 17 12:57:15.607474 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:15.607366 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="online" probeResult="failure" output="Get \"https://10.133.0.28:6567/health\": dial tcp 10.133.0.28:6567: connect: connection refused" Feb 17 12:57:16.846953 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:16.846906 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.30:8443: connect: connection refused" Feb 17 12:57:16.847368 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:16.846971 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="online" 
probeResult="failure" output="Get \"https://10.133.0.30:6567/health\": dial tcp 10.133.0.30:6567: connect: connection refused" Feb 17 12:57:25.189986 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:25.189939 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.28:8443: connect: connection refused" Feb 17 12:57:25.190479 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:25.190064 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:57:25.608015 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:25.607914 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="online" probeResult="failure" output="Get \"https://10.133.0.28:6567/health\": dial tcp 10.133.0.28:6567: connect: connection refused" Feb 17 12:57:25.608192 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:25.608083 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:57:26.847323 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:26.847275 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.30:8443: connect: connection refused" Feb 17 12:57:26.847728 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:26.847335 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="online" probeResult="failure" output="Get \"https://10.133.0.30:6567/health\": dial tcp 10.133.0.30:6567: connect: connection refused" Feb 17 12:57:26.847728 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:26.847405 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:57:26.847728 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:26.847445 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:57:28.380507 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.380486 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:57:28.495433 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.495354 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-online-tls\") pod \"3252cc01-47da-4470-ab0b-c36315dec301\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " Feb 17 12:57:28.495433 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.495389 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-ui-tls\") pod \"3252cc01-47da-4470-ab0b-c36315dec301\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " Feb 17 12:57:28.495433 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.495411 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/3252cc01-47da-4470-ab0b-c36315dec301-feast-data\") pod \"3252cc01-47da-4470-ab0b-c36315dec301\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " Feb 17 12:57:28.495433 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.495431 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-offline-tls\") pod \"3252cc01-47da-4470-ab0b-c36315dec301\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " Feb 17 12:57:28.495748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.495467 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmbgw\" (UniqueName: \"kubernetes.io/projected/3252cc01-47da-4470-ab0b-c36315dec301-kube-api-access-fmbgw\") pod \"3252cc01-47da-4470-ab0b-c36315dec301\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " Feb 17 12:57:28.495748 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.495496 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/3252cc01-47da-4470-ab0b-c36315dec301-registry-tls\") pod \"3252cc01-47da-4470-ab0b-c36315dec301\" (UID: \"3252cc01-47da-4470-ab0b-c36315dec301\") " Feb 17 12:57:28.495964 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.495935 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3252cc01-47da-4470-ab0b-c36315dec301-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3252cc01-47da-4470-ab0b-c36315dec301" (UID: "3252cc01-47da-4470-ab0b-c36315dec301"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 12:57:28.496059 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.496039 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3252cc01-47da-4470-ab0b-c36315dec301-feast-data" (OuterVolumeSpecName: "feast-data") pod "3252cc01-47da-4470-ab0b-c36315dec301" (UID: "3252cc01-47da-4470-ab0b-c36315dec301"). InnerVolumeSpecName "feast-data". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Feb 17 12:57:28.497798 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.497768 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3252cc01-47da-4470-ab0b-c36315dec301-kube-api-access-fmbgw" (OuterVolumeSpecName: "kube-api-access-fmbgw") pod "3252cc01-47da-4470-ab0b-c36315dec301" (UID: "3252cc01-47da-4470-ab0b-c36315dec301"). InnerVolumeSpecName "kube-api-access-fmbgw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 12:57:28.497924 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.497791 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-ui-tls" (OuterVolumeSpecName: "ui-tls") pod "3252cc01-47da-4470-ab0b-c36315dec301" (UID: "3252cc01-47da-4470-ab0b-c36315dec301"). InnerVolumeSpecName "ui-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:57:28.497996 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.497964 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-online-tls" (OuterVolumeSpecName: "online-tls") pod "3252cc01-47da-4470-ab0b-c36315dec301" (UID: "3252cc01-47da-4470-ab0b-c36315dec301"). InnerVolumeSpecName "online-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:57:28.498045 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.497988 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-offline-tls" (OuterVolumeSpecName: "offline-tls") pod "3252cc01-47da-4470-ab0b-c36315dec301" (UID: "3252cc01-47da-4470-ab0b-c36315dec301"). InnerVolumeSpecName "offline-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:57:28.596292 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.596261 2573 reconciler_common.go:299] "Volume detached for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-ui-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:57:28.596446 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.596301 2573 reconciler_common.go:299] "Volume detached for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/3252cc01-47da-4470-ab0b-c36315dec301-feast-data\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:57:28.596446 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.596312 2573 reconciler_common.go:299] "Volume detached for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-offline-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:57:28.596446 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.596321 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmbgw\" (UniqueName: \"kubernetes.io/projected/3252cc01-47da-4470-ab0b-c36315dec301-kube-api-access-fmbgw\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:57:28.596446 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.596330 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/3252cc01-47da-4470-ab0b-c36315dec301-registry-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:57:28.596446 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.596339 2573 reconciler_common.go:299] "Volume detached for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/3252cc01-47da-4470-ab0b-c36315dec301-online-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:57:28.906183 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:28.906160 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:57:29.000163 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.000103 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-ui-tls\") pod \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " Feb 17 12:57:29.000163 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.000168 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-registry-tls\") pod \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " Feb 17 12:57:29.000375 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.000207 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-online-tls\") pod \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " Feb 17 12:57:29.000375 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.000249 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-feast-data\") pod \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " Feb 17 12:57:29.000375 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.000299 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcjps\" (UniqueName: \"kubernetes.io/projected/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-kube-api-access-qcjps\") pod \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " Feb 17 12:57:29.000375 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.000320 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-offline-tls\") pod \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\" (UID: \"57fb8008-7e68-48e7-9bde-4a9f7e6301a0\") " Feb 17 12:57:29.000903 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.000873 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-feast-data" (OuterVolumeSpecName: "feast-data") pod "57fb8008-7e68-48e7-9bde-4a9f7e6301a0" (UID: "57fb8008-7e68-48e7-9bde-4a9f7e6301a0"). InnerVolumeSpecName "feast-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Feb 17 12:57:29.002449 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.002417 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "57fb8008-7e68-48e7-9bde-4a9f7e6301a0" (UID: "57fb8008-7e68-48e7-9bde-4a9f7e6301a0"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:57:29.002555 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.002485 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-kube-api-access-qcjps" (OuterVolumeSpecName: "kube-api-access-qcjps") pod "57fb8008-7e68-48e7-9bde-4a9f7e6301a0" (UID: "57fb8008-7e68-48e7-9bde-4a9f7e6301a0"). InnerVolumeSpecName "kube-api-access-qcjps". PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 12:57:29.002555 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.002505 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-ui-tls" (OuterVolumeSpecName: "ui-tls") pod "57fb8008-7e68-48e7-9bde-4a9f7e6301a0" (UID: "57fb8008-7e68-48e7-9bde-4a9f7e6301a0"). InnerVolumeSpecName "ui-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:57:29.002737 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.002716 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-offline-tls" (OuterVolumeSpecName: "offline-tls") pod "57fb8008-7e68-48e7-9bde-4a9f7e6301a0" (UID: "57fb8008-7e68-48e7-9bde-4a9f7e6301a0"). InnerVolumeSpecName "offline-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:57:29.002795 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.002736 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-online-tls" (OuterVolumeSpecName: "online-tls") pod "57fb8008-7e68-48e7-9bde-4a9f7e6301a0" (UID: "57fb8008-7e68-48e7-9bde-4a9f7e6301a0"). InnerVolumeSpecName "online-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:57:29.101408 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.101312 2573 reconciler_common.go:299] "Volume detached for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-ui-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:57:29.101408 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.101346 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-registry-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:57:29.101408 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.101360 2573 reconciler_common.go:299] "Volume detached for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-online-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:57:29.101408 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.101371 2573 reconciler_common.go:299] "Volume detached for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-feast-data\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:57:29.101408 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.101384 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qcjps\" (UniqueName: \"kubernetes.io/projected/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-kube-api-access-qcjps\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:57:29.101408 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.101397 2573 reconciler_common.go:299] "Volume detached for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/57fb8008-7e68-48e7-9bde-4a9f7e6301a0-offline-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:57:29.147297 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.147269 2573 generic.go:358] "Generic (PLEG): container finished" podID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerID="f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b" exitCode=137 Feb 17 12:57:29.147297 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.147293 2573 generic.go:358] "Generic (PLEG): container finished" podID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerID="f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba" exitCode=137 Feb 17 12:57:29.147492 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.147334 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" Feb 17 12:57:29.147492 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.147360 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" event={"ID":"57fb8008-7e68-48e7-9bde-4a9f7e6301a0","Type":"ContainerDied","Data":"f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b"} Feb 17 12:57:29.147492 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.147384 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" event={"ID":"57fb8008-7e68-48e7-9bde-4a9f7e6301a0","Type":"ContainerDied","Data":"f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba"} Feb 17 12:57:29.147492 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.147394 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss" event={"ID":"57fb8008-7e68-48e7-9bde-4a9f7e6301a0","Type":"ContainerDied","Data":"783c8bd8a7ebf71f006aaddd68a20016f81b0fe4a6fb8fc486110064834a4d52"} Feb 17 12:57:29.147492 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.147408 2573 scope.go:117] "RemoveContainer" containerID="6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880" Feb 17 12:57:29.149461 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.149437 2573 generic.go:358] "Generic (PLEG): container finished" podID="3252cc01-47da-4470-ab0b-c36315dec301" containerID="f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6" exitCode=137 Feb 17 12:57:29.149572 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.149504 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" event={"ID":"3252cc01-47da-4470-ab0b-c36315dec301","Type":"ContainerDied","Data":"f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6"} Feb 17 12:57:29.149572 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.149526 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" event={"ID":"3252cc01-47da-4470-ab0b-c36315dec301","Type":"ContainerDied","Data":"b6b08feb6bd06f322d8f16ed8cdfd8634419c6081617f2a3c4be85f767758d7c"} Feb 17 12:57:29.149686 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.149584 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h" Feb 17 12:57:29.156822 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.156807 2573 scope.go:117] "RemoveContainer" containerID="f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b" Feb 17 12:57:29.164905 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.164887 2573 scope.go:117] "RemoveContainer" containerID="a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f" Feb 17 12:57:29.168715 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.168690 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h"] Feb 17 12:57:29.171887 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.171865 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-558f8f8769-c7p6h"] Feb 17 12:57:29.174168 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.174031 2573 scope.go:117] "RemoveContainer" containerID="f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba" Feb 17 12:57:29.182017 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.181993 2573 scope.go:117] "RemoveContainer" containerID="04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f" Feb 17 12:57:29.183096 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.183074 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss"] Feb 17 12:57:29.187885 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.187861 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-5887dd77db-qzdss"] Feb 17 12:57:29.194685 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.194670 2573 scope.go:117] "RemoveContainer" containerID="6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880" Feb 17 12:57:29.194970 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:57:29.194952 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880\": container with ID starting with 6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880 not found: ID does not exist" containerID="6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880" Feb 17 12:57:29.195012 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.194979 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880"} err="failed to get container status \"6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880\": rpc error: code = NotFound desc = could not find container \"6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880\": container with ID starting with 6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880 not found: ID does not exist" Feb 17 12:57:29.195012 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.195000 2573 scope.go:117] "RemoveContainer" containerID="f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b" Feb 17 12:57:29.196204 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:57:29.196182 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b\": container with ID starting with 
f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b not found: ID does not exist" containerID="f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b" Feb 17 12:57:29.196267 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.196212 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b"} err="failed to get container status \"f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b\": rpc error: code = NotFound desc = could not find container \"f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b\": container with ID starting with f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b not found: ID does not exist" Feb 17 12:57:29.196267 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.196228 2573 scope.go:117] "RemoveContainer" containerID="a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f" Feb 17 12:57:29.196453 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:57:29.196436 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f\": container with ID starting with a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f not found: ID does not exist" containerID="a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f" Feb 17 12:57:29.196497 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.196458 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f"} err="failed to get container status \"a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f\": rpc error: code = NotFound desc = could not find container \"a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f\": container with ID starting with a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f not found: ID does not exist" Feb 17 12:57:29.196497 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.196474 2573 scope.go:117] "RemoveContainer" containerID="f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba" Feb 17 12:57:29.196713 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:57:29.196698 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba\": container with ID starting with f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba not found: ID does not exist" containerID="f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba" Feb 17 12:57:29.196758 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.196718 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba"} err="failed to get container status \"f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba\": rpc error: code = NotFound desc = could not find container \"f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba\": container with ID starting with f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba not found: ID does not exist" Feb 17 12:57:29.196758 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.196734 2573 scope.go:117] "RemoveContainer" 
containerID="04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f" Feb 17 12:57:29.196950 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:57:29.196932 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f\": container with ID starting with 04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f not found: ID does not exist" containerID="04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f" Feb 17 12:57:29.196994 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.196954 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f"} err="failed to get container status \"04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f\": rpc error: code = NotFound desc = could not find container \"04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f\": container with ID starting with 04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f not found: ID does not exist" Feb 17 12:57:29.196994 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.196968 2573 scope.go:117] "RemoveContainer" containerID="6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880" Feb 17 12:57:29.197202 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.197185 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880"} err="failed to get container status \"6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880\": rpc error: code = NotFound desc = could not find container \"6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880\": container with ID starting with 6aecdafcf79b3c60824636f1032ae0e8bee325fdbb8a29e7e0b0c7e59fb1b880 not found: ID does not exist" Feb 17 12:57:29.197242 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.197203 2573 scope.go:117] "RemoveContainer" containerID="f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b" Feb 17 12:57:29.197412 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.197395 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b"} err="failed to get container status \"f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b\": rpc error: code = NotFound desc = could not find container \"f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b\": container with ID starting with f7bde4df7109c3a00e854dfab10f23d096d003b3b4b76f764a348a471fcbfa3b not found: ID does not exist" Feb 17 12:57:29.197451 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.197412 2573 scope.go:117] "RemoveContainer" containerID="a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f" Feb 17 12:57:29.197628 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.197611 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f"} err="failed to get container status \"a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f\": rpc error: code = NotFound desc = could not find container \"a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f\": container with ID starting with 
a869a2dc3991d5f2361449a09b875a748ac3d5bc529f0632de350068a082757f not found: ID does not exist" Feb 17 12:57:29.197674 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.197629 2573 scope.go:117] "RemoveContainer" containerID="f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba" Feb 17 12:57:29.197834 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.197818 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba"} err="failed to get container status \"f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba\": rpc error: code = NotFound desc = could not find container \"f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba\": container with ID starting with f3e48c3356185badff12f4c6881973ee3ffa2de198672e0bfa15e5bf104109ba not found: ID does not exist" Feb 17 12:57:29.197878 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.197835 2573 scope.go:117] "RemoveContainer" containerID="04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f" Feb 17 12:57:29.198047 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.198029 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f"} err="failed to get container status \"04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f\": rpc error: code = NotFound desc = could not find container \"04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f\": container with ID starting with 04d8d0fc6333cbbdc166fe734723723e8e581f1a8bacb522c0f6449a8fa4201f not found: ID does not exist" Feb 17 12:57:29.198087 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.198048 2573 scope.go:117] "RemoveContainer" containerID="cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e" Feb 17 12:57:29.205767 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.205751 2573 scope.go:117] "RemoveContainer" containerID="f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6" Feb 17 12:57:29.212991 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.212970 2573 scope.go:117] "RemoveContainer" containerID="c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3" Feb 17 12:57:29.219932 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.219916 2573 scope.go:117] "RemoveContainer" containerID="83def61436613c82d94b02d56f0d5209e117434b1aa471e3744cd340755aab39" Feb 17 12:57:29.231588 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.231574 2573 scope.go:117] "RemoveContainer" containerID="cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e" Feb 17 12:57:29.231821 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:57:29.231804 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e\": container with ID starting with cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e not found: ID does not exist" containerID="cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e" Feb 17 12:57:29.231870 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.231828 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e"} err="failed to get container status \"cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e\": rpc 
error: code = NotFound desc = could not find container \"cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e\": container with ID starting with cf01aa08bc2caafbce2d504590f28a2fdd7a831a0ec0abe20518765fd355302e not found: ID does not exist" Feb 17 12:57:29.231870 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.231844 2573 scope.go:117] "RemoveContainer" containerID="f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6" Feb 17 12:57:29.232073 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:57:29.232048 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6\": container with ID starting with f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6 not found: ID does not exist" containerID="f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6" Feb 17 12:57:29.232134 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.232079 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6"} err="failed to get container status \"f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6\": rpc error: code = NotFound desc = could not find container \"f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6\": container with ID starting with f050cbe546fe6618b5b04ce5ae9b934d9e47838e989030287547f073e25916b6 not found: ID does not exist" Feb 17 12:57:29.232134 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.232096 2573 scope.go:117] "RemoveContainer" containerID="c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3" Feb 17 12:57:29.232336 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:57:29.232320 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3\": container with ID starting with c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3 not found: ID does not exist" containerID="c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3" Feb 17 12:57:29.232375 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.232340 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3"} err="failed to get container status \"c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3\": rpc error: code = NotFound desc = could not find container \"c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3\": container with ID starting with c798910741a2b8f99f55531a7afef7b1e060b4726b6f39c6b57f336317dbc3c3 not found: ID does not exist" Feb 17 12:57:29.232375 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:29.232353 2573 scope.go:117] "RemoveContainer" containerID="83def61436613c82d94b02d56f0d5209e117434b1aa471e3744cd340755aab39" Feb 17 12:57:29.232536 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:57:29.232517 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83def61436613c82d94b02d56f0d5209e117434b1aa471e3744cd340755aab39\": container with ID starting with 83def61436613c82d94b02d56f0d5209e117434b1aa471e3744cd340755aab39 not found: ID does not exist" containerID="83def61436613c82d94b02d56f0d5209e117434b1aa471e3744cd340755aab39" Feb 17 12:57:29.232573 
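The long run of "ContainerStatus from runtime service failed ... NotFound" errors above is benign: two cleanup paths race to remove the same containers, and whichever loses finds the ID already gone in CRI-O. The deletor logs the error and moves on, which is the standard idempotent-delete pattern for a gRPC API. A minimal sketch of that pattern (the runtimeClient interface is an illustrative stand-in, not the kubelet's actual CRI wiring):

    package criutil

    import (
    	"context"
    	"log"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // runtimeClient is a stand-in for a CRI runtime client.
    type runtimeClient interface {
    	RemoveContainer(ctx context.Context, id string) error
    }

    // removeContainer deletes a container by ID, treating gRPC NotFound as
    // success so repeated cleanup passes stay idempotent.
    func removeContainer(ctx context.Context, c runtimeClient, id string) error {
    	err := c.RemoveContainer(ctx, id)
    	if status.Code(err) == codes.NotFound {
    		// Already removed by an earlier pass; not a real failure.
    		log.Printf("container %s already removed", id)
    		return nil
    	}
    	return err
    }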
Feb 17 12:57:30.933772 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:30.933733 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3252cc01-47da-4470-ab0b-c36315dec301" path="/var/lib/kubelet/pods/3252cc01-47da-4470-ab0b-c36315dec301/volumes"
Feb 17 12:57:30.934269 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:30.934256 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" path="/var/lib/kubelet/pods/57fb8008-7e68-48e7-9bde-4a9f7e6301a0/volumes"
Feb 17 12:57:41.080207 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080170 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-feast/postgres-7899fd5bfd-jct8h"]
Feb 17 12:57:41.080554 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080510 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="registry"
Feb 17 12:57:41.080554 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080523 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="registry"
Feb 17 12:57:41.080554 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080532 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="online"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080559 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="online"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080570 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="offline"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080578 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="offline"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080592 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="feast-init"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080601 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="feast-init"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080609 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="offline"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080615 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="offline"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080633 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="online"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080638 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="online"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080649 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="feast-init"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080654 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="feast-init"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080659 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="ui"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080665 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="ui"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080673 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="ui"
Feb 17 12:57:41.080671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080678 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="ui"
Feb 17 12:57:41.081134 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080731 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="ui"
Feb 17 12:57:41.081134 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080740 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="offline"
Feb 17 12:57:41.081134 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080747 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="offline"
Feb 17 12:57:41.081134 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080753 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="registry"
Feb 17 12:57:41.081134 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080759 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="ui"
Feb 17 12:57:41.081134 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080766 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3252cc01-47da-4470-ab0b-c36315dec301" containerName="online"
Feb 17 12:57:41.081134 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.080773 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="57fb8008-7e68-48e7-9bde-4a9f7e6301a0" containerName="online"
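The RemoveStaleState burst above is the CPU and memory managers purging checkpointed per-container assignments for the two deleted pods before admitting the new postgres pod. The CPU manager persists this state in a JSON checkpoint on disk; a small diagnostic sketch for inspecting it follows (the path is the kubelet default, and the field names reflect the checkpoint format as I understand it — treat both as assumptions; with the default "none" policy the entries map is typically empty):

    package main

    import (
    	"encoding/json"
    	"fmt"
    	"os"
    )

    // Minimal view of the kubelet CPU manager checkpoint; any extra fields
    // (e.g. the checksum) are simply ignored by json.Unmarshal.
    type cpuManagerState struct {
    	PolicyName    string                       `json:"policyName"`
    	DefaultCPUSet string                       `json:"defaultCpuSet"`
    	Entries       map[string]map[string]string `json:"entries"` // podUID -> container -> cpuset
    }

    func main() {
    	raw, err := os.ReadFile("/var/lib/kubelet/cpu_manager_state")
    	if err != nil {
    		panic(err)
    	}
    	var st cpuManagerState
    	if err := json.Unmarshal(raw, &st); err != nil {
    		panic(err)
    	}
    	fmt.Printf("policy=%s defaultCpuSet=%q\n", st.PolicyName, st.DefaultCPUSet)
    	for pod, ctrs := range st.Entries {
    		for name, cpus := range ctrs {
    			fmt.Printf("pod %s container %s -> cpus %s\n", pod, name, cpus)
    		}
    	}
    }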
Need to start a new one" pod="test-ns-feast/postgres-7899fd5bfd-jct8h" Feb 17 12:57:41.086608 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.086578 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-feast\"/\"openshift-service-ca.crt\"" Feb 17 12:57:41.086765 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.086670 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-feast\"/\"kube-root-ca.crt\"" Feb 17 12:57:41.086765 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.086670 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"postgres-secret\"" Feb 17 12:57:41.086879 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.086832 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"default-dockercfg-tgwxj\"" Feb 17 12:57:41.090313 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.090289 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/postgres-7899fd5bfd-jct8h"] Feb 17 12:57:41.199982 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.199949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"postgresdata\" (UniqueName: \"kubernetes.io/empty-dir/33375932-8a8d-4a26-8c5b-80b29be005b1-postgresdata\") pod \"postgres-7899fd5bfd-jct8h\" (UID: \"33375932-8a8d-4a26-8c5b-80b29be005b1\") " pod="test-ns-feast/postgres-7899fd5bfd-jct8h" Feb 17 12:57:41.200180 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.199991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snq9z\" (UniqueName: \"kubernetes.io/projected/33375932-8a8d-4a26-8c5b-80b29be005b1-kube-api-access-snq9z\") pod \"postgres-7899fd5bfd-jct8h\" (UID: \"33375932-8a8d-4a26-8c5b-80b29be005b1\") " pod="test-ns-feast/postgres-7899fd5bfd-jct8h" Feb 17 12:57:41.301042 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.301005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"postgresdata\" (UniqueName: \"kubernetes.io/empty-dir/33375932-8a8d-4a26-8c5b-80b29be005b1-postgresdata\") pod \"postgres-7899fd5bfd-jct8h\" (UID: \"33375932-8a8d-4a26-8c5b-80b29be005b1\") " pod="test-ns-feast/postgres-7899fd5bfd-jct8h" Feb 17 12:57:41.301042 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.301050 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snq9z\" (UniqueName: \"kubernetes.io/projected/33375932-8a8d-4a26-8c5b-80b29be005b1-kube-api-access-snq9z\") pod \"postgres-7899fd5bfd-jct8h\" (UID: \"33375932-8a8d-4a26-8c5b-80b29be005b1\") " pod="test-ns-feast/postgres-7899fd5bfd-jct8h" Feb 17 12:57:41.301362 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.301342 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"postgresdata\" (UniqueName: \"kubernetes.io/empty-dir/33375932-8a8d-4a26-8c5b-80b29be005b1-postgresdata\") pod \"postgres-7899fd5bfd-jct8h\" (UID: \"33375932-8a8d-4a26-8c5b-80b29be005b1\") " pod="test-ns-feast/postgres-7899fd5bfd-jct8h" Feb 17 12:57:41.309119 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.309092 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snq9z\" (UniqueName: \"kubernetes.io/projected/33375932-8a8d-4a26-8c5b-80b29be005b1-kube-api-access-snq9z\") pod \"postgres-7899fd5bfd-jct8h\" (UID: \"33375932-8a8d-4a26-8c5b-80b29be005b1\") " 
pod="test-ns-feast/postgres-7899fd5bfd-jct8h" Feb 17 12:57:41.400863 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.400835 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/postgres-7899fd5bfd-jct8h" Feb 17 12:57:41.515550 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:41.515518 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/postgres-7899fd5bfd-jct8h"] Feb 17 12:57:41.518470 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:57:41.518438 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33375932_8a8d_4a26_8c5b_80b29be005b1.slice/crio-adf2b9a4f49938459bfabec51aafb1f546f5c8f70eb28e99a7dab7ef68666b21 WatchSource:0}: Error finding container adf2b9a4f49938459bfabec51aafb1f546f5c8f70eb28e99a7dab7ef68666b21: Status 404 returned error can't find the container with id adf2b9a4f49938459bfabec51aafb1f546f5c8f70eb28e99a7dab7ef68666b21 Feb 17 12:57:42.192896 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:42.192843 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/postgres-7899fd5bfd-jct8h" event={"ID":"33375932-8a8d-4a26-8c5b-80b29be005b1","Type":"ContainerStarted","Data":"adf2b9a4f49938459bfabec51aafb1f546f5c8f70eb28e99a7dab7ef68666b21"} Feb 17 12:57:46.209340 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:46.209303 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/postgres-7899fd5bfd-jct8h" event={"ID":"33375932-8a8d-4a26-8c5b-80b29be005b1","Type":"ContainerStarted","Data":"27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e"} Feb 17 12:57:46.223709 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:46.223643 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-feast/postgres-7899fd5bfd-jct8h" podStartSLOduration=0.871023514 podStartE2EDuration="5.223623544s" podCreationTimestamp="2026-02-17 12:57:41 +0000 UTC" firstStartedPulling="2026-02-17 12:57:41.520085757 +0000 UTC m=+685.114900573" lastFinishedPulling="2026-02-17 12:57:45.872685796 +0000 UTC m=+689.467500603" observedRunningTime="2026-02-17 12:57:46.221993388 +0000 UTC m=+689.816808216" watchObservedRunningTime="2026-02-17 12:57:46.223623544 +0000 UTC m=+689.818438370" Feb 17 12:57:49.628513 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.628474 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"] Feb 17 12:57:49.635648 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.635625 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.639352 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.638670 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-credit-scoring-online-tls\"" Feb 17 12:57:49.639352 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.638948 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-credit-scoring-registry-tls\"" Feb 17 12:57:49.639352 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.639178 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-credit-scoring-dockercfg-p26jk\"" Feb 17 12:57:49.639759 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.639740 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-credit-scoring-offline-tls\"" Feb 17 12:57:49.640626 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.640602 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"] Feb 17 12:57:49.782658 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.782617 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l8bv\" (UniqueName: \"kubernetes.io/projected/70862f1c-9699-48e2-99c5-c58751ed00b1-kube-api-access-9l8bv\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.782826 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.782664 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-online-tls\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.782826 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.782760 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-offline-tls\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.782826 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.782804 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/70862f1c-9699-48e2-99c5-c58751ed00b1-feast-data\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.782970 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.782843 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-registry-tls\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.883927 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.883837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9l8bv\" (UniqueName: \"kubernetes.io/projected/70862f1c-9699-48e2-99c5-c58751ed00b1-kube-api-access-9l8bv\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.883927 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.883914 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-online-tls\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.884147 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.883952 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-offline-tls\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.884147 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.883979 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/70862f1c-9699-48e2-99c5-c58751ed00b1-feast-data\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.884147 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.884128 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-registry-tls\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.884291 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:57:49.884224 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-credit-scoring-registry-tls: secret "feast-credit-scoring-registry-tls" not found Feb 17 12:57:49.884291 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:57:49.884284 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-registry-tls podName:70862f1c-9699-48e2-99c5-c58751ed00b1 nodeName:}" failed. No retries permitted until 2026-02-17 12:57:50.384267677 +0000 UTC m=+693.979082480 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-registry-tls") pod "feast-credit-scoring-7dbccc8456-nblgz" (UID: "70862f1c-9699-48e2-99c5-c58751ed00b1") : secret "feast-credit-scoring-registry-tls" not found Feb 17 12:57:49.884410 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.884393 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/70862f1c-9699-48e2-99c5-c58751ed00b1-feast-data\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.886575 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.886543 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-online-tls\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.886684 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.886599 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-offline-tls\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:49.891831 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:49.891812 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l8bv\" (UniqueName: \"kubernetes.io/projected/70862f1c-9699-48e2-99c5-c58751ed00b1-kube-api-access-9l8bv\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:50.390048 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:50.390005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-registry-tls\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:50.392480 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:50.392457 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-registry-tls\") pod \"feast-credit-scoring-7dbccc8456-nblgz\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") " pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:50.547586 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:50.547551 2573 util.go:30] "No sandbox for pod can be found. 
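The registry-tls failure above is the normal eventually-consistent path, not a fault: the pod referenced a secret that did not exist yet, the kubelet parked the mount with a 500ms durationBeforeRetry, and the retry at 12:57:50.390 succeeded once the secret had been created. Anyone scripting the same sequence can avoid the race by polling for the secret before creating the pod. A client-go sketch under those assumptions (clientset construction omitted; the interval and timeout are arbitrary choices, the secret name is the one from this log):

    package feastutil

    import (
    	"context"
    	"time"

    	apierrors "k8s.io/apimachinery/pkg/api/errors"
    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/apimachinery/pkg/util/wait"
    	"k8s.io/client-go/kubernetes"
    )

    // waitForSecret polls until the named secret exists or the timeout expires,
    // mirroring the kubelet's park-and-retry behaviour seen in the log.
    func waitForSecret(ctx context.Context, cs kubernetes.Interface, ns, name string) error {
    	return wait.PollUntilContextTimeout(ctx, 500*time.Millisecond, 30*time.Second, true,
    		func(ctx context.Context) (bool, error) {
    			_, err := cs.CoreV1().Secrets(ns).Get(ctx, name, metav1.GetOptions{})
    			switch {
    			case err == nil:
    				return true, nil // secret exists; safe to create the pod
    			case apierrors.IsNotFound(err):
    				return false, nil // not created yet; poll again
    			default:
    				return false, err // real API error; give up
    			}
    		})
    }

For this log that would be waitForSecret(ctx, cs, "test-ns-feast", "feast-credit-scoring-registry-tls") before submitting the Deployment.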
Need to start a new one" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:50.670594 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:50.670562 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"] Feb 17 12:57:50.670968 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:57:50.670823 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70862f1c_9699_48e2_99c5_c58751ed00b1.slice/crio-e341f45f36c47771444ee1ea6d76dd82030d71d90d61b29acfa21e9c26fab3af WatchSource:0}: Error finding container e341f45f36c47771444ee1ea6d76dd82030d71d90d61b29acfa21e9c26fab3af: Status 404 returned error can't find the container with id e341f45f36c47771444ee1ea6d76dd82030d71d90d61b29acfa21e9c26fab3af Feb 17 12:57:50.672790 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:50.672775 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 12:57:51.226547 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:51.226513 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerStarted","Data":"01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0"} Feb 17 12:57:51.226735 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:51.226557 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerStarted","Data":"e341f45f36c47771444ee1ea6d76dd82030d71d90d61b29acfa21e9c26fab3af"} Feb 17 12:57:52.230634 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:52.230603 2573 generic.go:358] "Generic (PLEG): container finished" podID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerID="01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0" exitCode=0 Feb 17 12:57:52.231006 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:52.230646 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerDied","Data":"01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0"} Feb 17 12:57:53.237568 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:53.237526 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerStarted","Data":"7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3"} Feb 17 12:57:53.237568 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:53.237573 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerStarted","Data":"973be92a20ce1941d6f2c30b0d68b3380ea97b1307c6006c17d26c7b6bb514f9"} Feb 17 12:57:53.238103 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:53.237587 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerStarted","Data":"e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888"} Feb 17 12:57:53.258547 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:53.258484 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podStartSLOduration=4.258463298 podStartE2EDuration="4.258463298s" podCreationTimestamp="2026-02-17 12:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 12:57:53.256637741 +0000 UTC m=+696.851452578" watchObservedRunningTime="2026-02-17 12:57:53.258463298 +0000 UTC m=+696.853278124" Feb 17 12:57:53.548786 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:53.548689 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:53.548786 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:53.548742 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:53.548786 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:53.548756 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:57:53.550513 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:53.550458 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="registry" probeResult="failure" output="dial tcp 10.133.0.32:6571: connect: connection refused" Feb 17 12:57:53.550660 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:53.550511 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online" probeResult="failure" output="Get \"https://10.133.0.32:6567/health\": dial tcp 10.133.0.32:6567: connect: connection refused" Feb 17 12:57:53.550660 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:53.550488 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="offline" probeResult="failure" output="dial tcp 10.133.0.32:8816: connect: connection refused" Feb 17 12:57:56.548802 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:56.548756 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="registry" probeResult="failure" output="dial tcp 10.133.0.32:6571: connect: connection refused" Feb 17 12:57:56.549245 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:56.548797 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online" probeResult="failure" output="Get \"https://10.133.0.32:6567/health\": dial tcp 10.133.0.32:6567: connect: connection refused" Feb 17 12:57:56.549245 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:56.548757 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="offline" probeResult="failure" output="dial tcp 10.133.0.32:8816: connect: connection refused" Feb 17 12:57:57.255239 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:57.255209 2573 generic.go:358] "Generic (PLEG): container finished" podID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerID="7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3" 
Feb 17 12:57:57.255239 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:57.255209 2573 generic.go:358] "Generic (PLEG): container finished" podID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerID="7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3" exitCode=1
Feb 17 12:57:57.255239 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:57.255232 2573 generic.go:358] "Generic (PLEG): container finished" podID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerID="973be92a20ce1941d6f2c30b0d68b3380ea97b1307c6006c17d26c7b6bb514f9" exitCode=1
Feb 17 12:57:57.255446 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:57.255290 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerDied","Data":"7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3"}
Feb 17 12:57:57.255446 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:57.255341 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerDied","Data":"973be92a20ce1941d6f2c30b0d68b3380ea97b1307c6006c17d26c7b6bb514f9"}
Feb 17 12:57:57.255908 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:57.255891 2573 scope.go:117] "RemoveContainer" containerID="973be92a20ce1941d6f2c30b0d68b3380ea97b1307c6006c17d26c7b6bb514f9"
Feb 17 12:57:57.255956 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:57.255919 2573 scope.go:117] "RemoveContainer" containerID="7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3"
Feb 17 12:57:58.263557 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:58.263464 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerStarted","Data":"e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6"}
Feb 17 12:57:58.263557 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:58.263519 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerStarted","Data":"93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6"}
Feb 17 12:57:59.548742 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:59.548686 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:57:59.549245 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:59.548754 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:57:59.549245 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:59.548743 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="offline" probeResult="failure" output="dial tcp 10.133.0.32:8816: connect: connection refused"
Feb 17 12:57:59.549245 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:59.548756 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="registry" probeResult="failure" output="dial tcp 10.133.0.32:6571: connect: connection refused"
Feb 17 12:57:59.549245 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:57:59.548749 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online" probeResult="failure" output="Get \"https://10.133.0.32:6567/health\": dial tcp 10.133.0.32:6567: connect: connection refused"
Feb 17 12:58:00.547835 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:00.547798 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:58:00.548043 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:00.547848 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:58:00.548043 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:00.547861 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:58:02.549334 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:02.549293 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:58:02.549731 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:02.549405 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:58:02.550046 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:02.550019 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:58:02.550217 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:02.550200 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:58:02.554232 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:02.554211 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:58:03.283972 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:03.283947 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:58:46.677028 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:46.676993 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"]
Feb 17 12:58:46.677479 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:46.677323 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="registry" containerID="cri-o://e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888" gracePeriod=30
Feb 17 12:58:46.677479 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:46.677391 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online" containerID="cri-o://93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6" gracePeriod=30
Feb 17 12:58:46.677581 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:46.677395 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="offline" containerID="cri-o://e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6" gracePeriod=30
Feb 17 12:58:46.695626 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:46.695600 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/postgres-7899fd5bfd-jct8h"]
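The DELETEs at 12:58:46 begin a graceful shutdown: each container is killed with gracePeriod=30, i.e. SIGTERM immediately and SIGKILL 30 seconds later if the process has not exited. The same grace period can be set or overridden from the client side; a client-go sketch (clientset construction omitted, names taken from this log, 30 matching the logged gracePeriod):

    package feastutil

    import (
    	"context"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    )

    // deletePodWithGrace issues the API-server DELETE that produces the
    // "SyncLoop DELETE" and "Killing container with a grace period" lines above.
    func deletePodWithGrace(ctx context.Context, cs kubernetes.Interface, ns, name string, seconds int64) error {
    	return cs.CoreV1().Pods(ns).Delete(ctx, name, metav1.DeleteOptions{
    		GracePeriodSeconds: &seconds, // SIGTERM now, SIGKILL after this many seconds
    	})
    }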
Feb 17 12:58:46.695780 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:46.695763 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/postgres-7899fd5bfd-jct8h" podUID="33375932-8a8d-4a26-8c5b-80b29be005b1" containerName="postgres" containerID="cri-o://27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e" gracePeriod=30
Feb 17 12:58:47.420745 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:47.420716 2573 generic.go:358] "Generic (PLEG): container finished" podID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerID="93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6" exitCode=0
Feb 17 12:58:47.420908 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:47.420786 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerDied","Data":"93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6"}
Feb 17 12:58:47.420908 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:47.420830 2573 scope.go:117] "RemoveContainer" containerID="973be92a20ce1941d6f2c30b0d68b3380ea97b1307c6006c17d26c7b6bb514f9"
Feb 17 12:58:53.281920 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:58:53.281868 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online" probeResult="failure" output="Get \"https://10.133.0.32:6567/health\": dial tcp 10.133.0.32:6567: connect: connection refused"
Feb 17 12:59:03.281622 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:03.281576 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online" probeResult="failure" output="Get \"https://10.133.0.32:6567/health\": dial tcp 10.133.0.32:6567: connect: connection refused"
Feb 17 12:59:13.281955 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:13.281912 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online" probeResult="failure" output="Get \"https://10.133.0.32:6567/health\": dial tcp 10.133.0.32:6567: connect: connection refused"
Feb 17 12:59:13.282340 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:13.282046 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:59:16.889292 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:16.889269 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/postgres-7899fd5bfd-jct8h"
Feb 17 12:59:16.956744 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:16.956674 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"postgresdata\" (UniqueName: \"kubernetes.io/empty-dir/33375932-8a8d-4a26-8c5b-80b29be005b1-postgresdata\") pod \"33375932-8a8d-4a26-8c5b-80b29be005b1\" (UID: \"33375932-8a8d-4a26-8c5b-80b29be005b1\") "
Feb 17 12:59:16.956901 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:16.956776 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snq9z\" (UniqueName: \"kubernetes.io/projected/33375932-8a8d-4a26-8c5b-80b29be005b1-kube-api-access-snq9z\") pod \"33375932-8a8d-4a26-8c5b-80b29be005b1\" (UID: \"33375932-8a8d-4a26-8c5b-80b29be005b1\") "
Feb 17 12:59:16.956970 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:16.956907 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33375932-8a8d-4a26-8c5b-80b29be005b1-postgresdata" (OuterVolumeSpecName: "postgresdata") pod "33375932-8a8d-4a26-8c5b-80b29be005b1" (UID: "33375932-8a8d-4a26-8c5b-80b29be005b1"). InnerVolumeSpecName "postgresdata". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Feb 17 12:59:16.958552 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:16.958529 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33375932-8a8d-4a26-8c5b-80b29be005b1-kube-api-access-snq9z" (OuterVolumeSpecName: "kube-api-access-snq9z") pod "33375932-8a8d-4a26-8c5b-80b29be005b1" (UID: "33375932-8a8d-4a26-8c5b-80b29be005b1"). InnerVolumeSpecName "kube-api-access-snq9z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Feb 17 12:59:17.057711 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.057680 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-snq9z\" (UniqueName: \"kubernetes.io/projected/33375932-8a8d-4a26-8c5b-80b29be005b1-kube-api-access-snq9z\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Feb 17 12:59:17.057947 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.057925 2573 reconciler_common.go:299] "Volume detached for volume \"postgresdata\" (UniqueName: \"kubernetes.io/empty-dir/33375932-8a8d-4a26-8c5b-80b29be005b1-postgresdata\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Feb 17 12:59:17.311522 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.311497 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"
Feb 17 12:59:17.360148 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.360099 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-registry-tls\") pod \"70862f1c-9699-48e2-99c5-c58751ed00b1\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") "
Feb 17 12:59:17.360314 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.360174 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-online-tls\") pod \"70862f1c-9699-48e2-99c5-c58751ed00b1\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") "
Feb 17 12:59:17.360314 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.360217 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-offline-tls\") pod \"70862f1c-9699-48e2-99c5-c58751ed00b1\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") "
Feb 17 12:59:17.360314 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.360280 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l8bv\" (UniqueName: \"kubernetes.io/projected/70862f1c-9699-48e2-99c5-c58751ed00b1-kube-api-access-9l8bv\") pod \"70862f1c-9699-48e2-99c5-c58751ed00b1\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") "
Feb 17 12:59:17.360470 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.360338 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/70862f1c-9699-48e2-99c5-c58751ed00b1-feast-data\") pod \"70862f1c-9699-48e2-99c5-c58751ed00b1\" (UID: \"70862f1c-9699-48e2-99c5-c58751ed00b1\") "
Feb 17 12:59:17.362334 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.362301 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70862f1c-9699-48e2-99c5-c58751ed00b1-kube-api-access-9l8bv" (OuterVolumeSpecName: "kube-api-access-9l8bv") pod "70862f1c-9699-48e2-99c5-c58751ed00b1" (UID: "70862f1c-9699-48e2-99c5-c58751ed00b1"). InnerVolumeSpecName "kube-api-access-9l8bv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Feb 17 12:59:17.362525 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.362509 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "70862f1c-9699-48e2-99c5-c58751ed00b1" (UID: "70862f1c-9699-48e2-99c5-c58751ed00b1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Feb 17 12:59:17.362689 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.362666 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-online-tls" (OuterVolumeSpecName: "online-tls") pod "70862f1c-9699-48e2-99c5-c58751ed00b1" (UID: "70862f1c-9699-48e2-99c5-c58751ed00b1"). InnerVolumeSpecName "online-tls".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:59:17.362689 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.362681 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-offline-tls" (OuterVolumeSpecName: "offline-tls") pod "70862f1c-9699-48e2-99c5-c58751ed00b1" (UID: "70862f1c-9699-48e2-99c5-c58751ed00b1"). InnerVolumeSpecName "offline-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 12:59:17.370213 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.370189 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70862f1c-9699-48e2-99c5-c58751ed00b1-feast-data" (OuterVolumeSpecName: "feast-data") pod "70862f1c-9699-48e2-99c5-c58751ed00b1" (UID: "70862f1c-9699-48e2-99c5-c58751ed00b1"). InnerVolumeSpecName "feast-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Feb 17 12:59:17.461219 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.461180 2573 reconciler_common.go:299] "Volume detached for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/70862f1c-9699-48e2-99c5-c58751ed00b1-feast-data\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:59:17.461219 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.461216 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-registry-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:59:17.461219 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.461225 2573 reconciler_common.go:299] "Volume detached for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-online-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:59:17.461219 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.461234 2573 reconciler_common.go:299] "Volume detached for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/70862f1c-9699-48e2-99c5-c58751ed00b1-offline-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:59:17.461454 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.461245 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9l8bv\" (UniqueName: \"kubernetes.io/projected/70862f1c-9699-48e2-99c5-c58751ed00b1-kube-api-access-9l8bv\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:59:17.515576 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.515501 2573 generic.go:358] "Generic (PLEG): container finished" podID="33375932-8a8d-4a26-8c5b-80b29be005b1" containerID="27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e" exitCode=137 Feb 17 12:59:17.515576 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.515532 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/postgres-7899fd5bfd-jct8h" event={"ID":"33375932-8a8d-4a26-8c5b-80b29be005b1","Type":"ContainerDied","Data":"27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e"} Feb 17 12:59:17.515576 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.515566 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/postgres-7899fd5bfd-jct8h" event={"ID":"33375932-8a8d-4a26-8c5b-80b29be005b1","Type":"ContainerDied","Data":"adf2b9a4f49938459bfabec51aafb1f546f5c8f70eb28e99a7dab7ef68666b21"} Feb 17 12:59:17.515576 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.515571 2573 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="test-ns-feast/postgres-7899fd5bfd-jct8h" Feb 17 12:59:17.515868 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.515581 2573 scope.go:117] "RemoveContainer" containerID="27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e" Feb 17 12:59:17.517887 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.517865 2573 generic.go:358] "Generic (PLEG): container finished" podID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerID="e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6" exitCode=137 Feb 17 12:59:17.517887 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.517884 2573 generic.go:358] "Generic (PLEG): container finished" podID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerID="e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888" exitCode=137 Feb 17 12:59:17.518044 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.517936 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" Feb 17 12:59:17.518044 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.517940 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerDied","Data":"e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6"} Feb 17 12:59:17.518044 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.517971 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerDied","Data":"e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888"} Feb 17 12:59:17.518044 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.517986 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz" event={"ID":"70862f1c-9699-48e2-99c5-c58751ed00b1","Type":"ContainerDied","Data":"e341f45f36c47771444ee1ea6d76dd82030d71d90d61b29acfa21e9c26fab3af"} Feb 17 12:59:17.536971 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.536950 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/postgres-7899fd5bfd-jct8h"] Feb 17 12:59:17.538619 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.538600 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-feast/postgres-7899fd5bfd-jct8h"] Feb 17 12:59:17.548825 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.548798 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"] Feb 17 12:59:17.551878 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.551857 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-feast/feast-credit-scoring-7dbccc8456-nblgz"] Feb 17 12:59:17.557349 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.557331 2573 scope.go:117] "RemoveContainer" containerID="27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e" Feb 17 12:59:17.557625 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:17.557609 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e\": container with ID starting with 27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e not found: ID does not exist" containerID="27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e" Feb 17 
Feb 17 12:59:17.557691 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.557632 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e"} err="failed to get container status \"27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e\": rpc error: code = NotFound desc = could not find container \"27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e\": container with ID starting with 27889f3125f09cb3ca8c023ec1830d61266644910aaacb617b3ff5e416a6979e not found: ID does not exist"
Feb 17 12:59:17.557691 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.557653 2573 scope.go:117] "RemoveContainer" containerID="e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6"
Feb 17 12:59:17.565056 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.565037 2573 scope.go:117] "RemoveContainer" containerID="93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6"
Feb 17 12:59:17.572042 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.572025 2573 scope.go:117] "RemoveContainer" containerID="7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3"
Feb 17 12:59:17.583681 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.583660 2573 scope.go:117] "RemoveContainer" containerID="e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888"
Feb 17 12:59:17.590971 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.590953 2573 scope.go:117] "RemoveContainer" containerID="01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0"
Feb 17 12:59:17.598078 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.598060 2573 scope.go:117] "RemoveContainer" containerID="e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6"
Feb 17 12:59:17.598346 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:17.598327 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6\": container with ID starting with e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6 not found: ID does not exist" containerID="e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6"
Feb 17 12:59:17.598416 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.598357 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6"} err="failed to get container status \"e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6\": rpc error: code = NotFound desc = could not find container \"e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6\": container with ID starting with e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6 not found: ID does not exist"
Feb 17 12:59:17.598416 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.598384 2573 scope.go:117] "RemoveContainer" containerID="93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6"
Feb 17 12:59:17.598632 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:17.598614 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6\": container with ID starting with 93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6 not found: ID does not exist" containerID="93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6"
Feb 17 12:59:17.598677 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.598637 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6"} err="failed to get container status \"93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6\": rpc error: code = NotFound desc = could not find container \"93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6\": container with ID starting with 93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6 not found: ID does not exist"
Feb 17 12:59:17.598677 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.598653 2573 scope.go:117] "RemoveContainer" containerID="7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3"
Feb 17 12:59:17.598857 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:17.598841 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3\": container with ID starting with 7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3 not found: ID does not exist" containerID="7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3"
Feb 17 12:59:17.598900 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.598862 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3"} err="failed to get container status \"7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3\": rpc error: code = NotFound desc = could not find container \"7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3\": container with ID starting with 7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3 not found: ID does not exist"
Feb 17 12:59:17.598900 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.598877 2573 scope.go:117] "RemoveContainer" containerID="e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888"
Feb 17 12:59:17.599128 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:17.599092 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888\": container with ID starting with e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888 not found: ID does not exist" containerID="e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888"
Feb 17 12:59:17.599261 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.599239 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888"} err="failed to get container status \"e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888\": rpc error: code = NotFound desc = could not find container \"e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888\": container with ID starting with e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888 not found: ID does not exist"
Feb 17 12:59:17.599307 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.599265 2573 scope.go:117] "RemoveContainer" containerID="01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0"
Feb 17 12:59:17.599470 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:17.599455 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0\": container with ID starting with 01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0 not found: ID does not exist" containerID="01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0"
Feb 17 12:59:17.599515 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.599473 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0"} err="failed to get container status \"01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0\": rpc error: code = NotFound desc = could not find container \"01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0\": container with ID starting with 01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0 not found: ID does not exist"
Feb 17 12:59:17.599515 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.599487 2573 scope.go:117] "RemoveContainer" containerID="e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6"
Feb 17 12:59:17.599687 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.599667 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6"} err="failed to get container status \"e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6\": rpc error: code = NotFound desc = could not find container \"e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6\": container with ID starting with e9e2f994fc5ba17b3c4b7f3184c92185818ffccd19da432209b76ee28c4177c6 not found: ID does not exist"
Feb 17 12:59:17.599739 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.599687 2573 scope.go:117] "RemoveContainer" containerID="93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6"
Feb 17 12:59:17.599892 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.599876 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6"} err="failed to get container status \"93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6\": rpc error: code = NotFound desc = could not find container \"93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6\": container with ID starting with 93a805b6da5e3ca8dd6c1a78c258523131a9321c0dc44d7fd809bafd4c637fe6 not found: ID does not exist"
Feb 17 12:59:17.599940 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.599893 2573 scope.go:117] "RemoveContainer" containerID="7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3"
Feb 17 12:59:17.600099 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.600080 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3"} err="failed to get container status \"7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3\": rpc error: code = NotFound desc = could not find container \"7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3\": container with ID starting with 7d49c1bda33b1a25cf6f939d6d7db77d68fad2b3c84e93c1c347346703ddc7c3 not found: ID does not exist"
Feb 17 12:59:17.600197 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.600100 2573 scope.go:117] "RemoveContainer" containerID="e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888"
Feb 17 12:59:17.600469 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.600449 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888"} err="failed to get container status \"e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888\": rpc error: code = NotFound desc = could not find container \"e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888\": container with ID starting with e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888 not found: ID does not exist"
Feb 17 12:59:17.600545 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.600470 2573 scope.go:117] "RemoveContainer" containerID="01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0"
Feb 17 12:59:17.600675 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.600657 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0"} err="failed to get container status \"01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0\": rpc error: code = NotFound desc = could not find container \"01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0\": container with ID starting with 01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0 not found: ID does not exist"
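
Annotation: the long run of "RemoveContainer" / NotFound / "DeleteContainer returned error" records above looks alarming but is benign. Several cleanup passes race to remove the same already-removed containers, and whoever arrives second gets NotFound back from CRI-O. The standard way to make such deletion idempotent is to swallow exactly that error; a generic sketch of the pattern, not kubelet's actual code:

    # Sketch: idempotent delete - NotFound on removal means the work is already done.
    class NotFoundError(Exception):
        """Stand-in for CRI's 'code = NotFound' RPC error."""

    def remove_container(runtime, container_id: str) -> None:
        try:
            runtime.remove(container_id)
        except NotFoundError:
            # An earlier pass already removed it: treat as success, not failure.
            pass
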
scope.go:117] "RemoveContainer" containerID="e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888" Feb 17 12:59:17.600469 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.600449 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888"} err="failed to get container status \"e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888\": rpc error: code = NotFound desc = could not find container \"e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888\": container with ID starting with e26710922901b42626b0bd700639bd8b13b4436a6a6dc7ce188fcc65bd232888 not found: ID does not exist" Feb 17 12:59:17.600545 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.600470 2573 scope.go:117] "RemoveContainer" containerID="01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0" Feb 17 12:59:17.600675 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:17.600657 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0"} err="failed to get container status \"01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0\": rpc error: code = NotFound desc = could not find container \"01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0\": container with ID starting with 01489654a450678830a2455334f5232efafe51660f57b3753c683053f72816b0 not found: ID does not exist" Feb 17 12:59:18.933582 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:18.933548 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33375932-8a8d-4a26-8c5b-80b29be005b1" path="/var/lib/kubelet/pods/33375932-8a8d-4a26-8c5b-80b29be005b1/volumes" Feb 17 12:59:18.933951 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:18.933930 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" path="/var/lib/kubelet/pods/70862f1c-9699-48e2-99c5-c58751ed00b1/volumes" Feb 17 12:59:23.915174 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:23.915139 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l"] Feb 17 12:59:23.915559 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:23.915372 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" podUID="52e1160d-5658-43c9-b4a3-eee99ff456aa" containerName="manager" containerID="cri-o://b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d" gracePeriod=10 Feb 17 12:59:24.160714 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.160694 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" Feb 17 12:59:24.319861 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.319775 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m96jf\" (UniqueName: \"kubernetes.io/projected/52e1160d-5658-43c9-b4a3-eee99ff456aa-kube-api-access-m96jf\") pod \"52e1160d-5658-43c9-b4a3-eee99ff456aa\" (UID: \"52e1160d-5658-43c9-b4a3-eee99ff456aa\") " Feb 17 12:59:24.321916 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.321890 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e1160d-5658-43c9-b4a3-eee99ff456aa-kube-api-access-m96jf" (OuterVolumeSpecName: "kube-api-access-m96jf") pod "52e1160d-5658-43c9-b4a3-eee99ff456aa" (UID: "52e1160d-5658-43c9-b4a3-eee99ff456aa"). InnerVolumeSpecName "kube-api-access-m96jf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 12:59:24.420957 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.420910 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m96jf\" (UniqueName: \"kubernetes.io/projected/52e1160d-5658-43c9-b4a3-eee99ff456aa-kube-api-access-m96jf\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 12:59:24.542843 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.542807 2573 generic.go:358] "Generic (PLEG): container finished" podID="52e1160d-5658-43c9-b4a3-eee99ff456aa" containerID="b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d" exitCode=0 Feb 17 12:59:24.543039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.542856 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" event={"ID":"52e1160d-5658-43c9-b4a3-eee99ff456aa","Type":"ContainerDied","Data":"b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d"} Feb 17 12:59:24.543039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.542868 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" Feb 17 12:59:24.543039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.542882 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l" event={"ID":"52e1160d-5658-43c9-b4a3-eee99ff456aa","Type":"ContainerDied","Data":"4526cbba45fe191da2e738a22524229a75dfbdbbdb77f36b40db60672b1b39ec"} Feb 17 12:59:24.543039 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.542898 2573 scope.go:117] "RemoveContainer" containerID="b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d" Feb 17 12:59:24.551941 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.551918 2573 scope.go:117] "RemoveContainer" containerID="b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d" Feb 17 12:59:24.552207 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:24.552188 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d\": container with ID starting with b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d not found: ID does not exist" containerID="b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d" Feb 17 12:59:24.552281 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.552218 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d"} err="failed to get container status \"b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d\": rpc error: code = NotFound desc = could not find container \"b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d\": container with ID starting with b09dc3368172acfdd288925bf3e20b6195696ddfba07f81906d0531e97ca9e1d not found: ID does not exist" Feb 17 12:59:24.566228 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.566204 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l"] Feb 17 12:59:24.569455 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.569436 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["feast-operator-system/feast-operator-controller-manager-8c74c7748-lm82l"] Feb 17 12:59:24.933251 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:24.933220 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52e1160d-5658-43c9-b4a3-eee99ff456aa" path="/var/lib/kubelet/pods/52e1160d-5658-43c9-b4a3-eee99ff456aa/volumes" Feb 17 12:59:33.626500 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626422 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6"] Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626728 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online" Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626740 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online" Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626750 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online" Feb 17 
Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626756 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online"
Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626767 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="offline"
Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626773 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="offline"
Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626780 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="registry"
Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626784 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="registry"
Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626792 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33375932-8a8d-4a26-8c5b-80b29be005b1" containerName="postgres"
Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626797 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="33375932-8a8d-4a26-8c5b-80b29be005b1" containerName="postgres"
Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626805 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="feast-init"
Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626809 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="feast-init"
Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626817 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52e1160d-5658-43c9-b4a3-eee99ff456aa" containerName="manager"
Feb 17 12:59:33.626849 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626822 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e1160d-5658-43c9-b4a3-eee99ff456aa" containerName="manager"
Feb 17 12:59:33.627395 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626870 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="52e1160d-5658-43c9-b4a3-eee99ff456aa" containerName="manager"
Feb 17 12:59:33.627395 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626880 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="offline"
Feb 17 12:59:33.627395 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626886 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online"
Feb 17 12:59:33.627395 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626893 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="registry"
Feb 17 12:59:33.627395 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626899 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="33375932-8a8d-4a26-8c5b-80b29be005b1" containerName="postgres"
Feb 17 12:59:33.627395 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.626905 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online"
"RemoveStaleState removing state" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="online" Feb 17 12:59:33.629655 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.629638 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" Feb 17 12:59:33.632031 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.632009 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"feast-operator-system\"/\"feast-operator-controller-manager-dockercfg-qqnqs\"" Feb 17 12:59:33.632145 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.632078 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"feast-operator-system\"/\"openshift-service-ca.crt\"" Feb 17 12:59:33.633374 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.633356 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"feast-operator-system\"/\"kube-root-ca.crt\"" Feb 17 12:59:33.637943 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.637922 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6"] Feb 17 12:59:33.801805 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.801766 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vkpb\" (UniqueName: \"kubernetes.io/projected/0dd8a7cf-bc8a-4865-b648-f720263400c5-kube-api-access-4vkpb\") pod \"feast-operator-controller-manager-6984f6c56-n68c6\" (UID: \"0dd8a7cf-bc8a-4865-b648-f720263400c5\") " pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" Feb 17 12:59:33.903149 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.903101 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vkpb\" (UniqueName: \"kubernetes.io/projected/0dd8a7cf-bc8a-4865-b648-f720263400c5-kube-api-access-4vkpb\") pod \"feast-operator-controller-manager-6984f6c56-n68c6\" (UID: \"0dd8a7cf-bc8a-4865-b648-f720263400c5\") " pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" Feb 17 12:59:33.912013 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.911985 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vkpb\" (UniqueName: \"kubernetes.io/projected/0dd8a7cf-bc8a-4865-b648-f720263400c5-kube-api-access-4vkpb\") pod \"feast-operator-controller-manager-6984f6c56-n68c6\" (UID: \"0dd8a7cf-bc8a-4865-b648-f720263400c5\") " pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" Feb 17 12:59:33.941006 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:33.940981 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" Feb 17 12:59:34.055750 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:34.055725 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6"] Feb 17 12:59:34.058498 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:59:34.058466 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dd8a7cf_bc8a_4865_b648_f720263400c5.slice/crio-3a58093024be584389abf5a20f005278585690df36c448265d698270d8826ff1 WatchSource:0}: Error finding container 3a58093024be584389abf5a20f005278585690df36c448265d698270d8826ff1: Status 404 returned error can't find the container with id 3a58093024be584389abf5a20f005278585690df36c448265d698270d8826ff1 Feb 17 12:59:34.577138 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:34.577087 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" event={"ID":"0dd8a7cf-bc8a-4865-b648-f720263400c5","Type":"ContainerStarted","Data":"3a58093024be584389abf5a20f005278585690df36c448265d698270d8826ff1"} Feb 17 12:59:36.585479 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:36.585442 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" event={"ID":"0dd8a7cf-bc8a-4865-b648-f720263400c5","Type":"ContainerStarted","Data":"3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12"} Feb 17 12:59:36.585846 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:36.585526 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" Feb 17 12:59:36.602796 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:36.602758 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" podStartSLOduration=2.012854492 podStartE2EDuration="3.602748088s" podCreationTimestamp="2026-02-17 12:59:33 +0000 UTC" firstStartedPulling="2026-02-17 12:59:34.060656452 +0000 UTC m=+797.655471268" lastFinishedPulling="2026-02-17 12:59:35.650550057 +0000 UTC m=+799.245364864" observedRunningTime="2026-02-17 12:59:36.602550384 +0000 UTC m=+800.197365210" watchObservedRunningTime="2026-02-17 12:59:36.602748088 +0000 UTC m=+800.197562913" Feb 17 12:59:47.590603 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:47.590570 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" Feb 17 12:59:49.465868 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.459978 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"] Feb 17 12:59:49.467026 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.466998 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="offline" Feb 17 12:59:49.467026 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.467023 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="offline" Feb 17 12:59:49.467194 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.467126 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="70862f1c-9699-48e2-99c5-c58751ed00b1" containerName="offline" Feb 17 12:59:49.471046 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.471027 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.474620 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.474596 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-registry-tls\"" Feb 17 12:59:49.474979 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.474959 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-ui-tls\"" Feb 17 12:59:49.475177 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.475157 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-dockercfg-df84n\"" Feb 17 12:59:49.475542 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.474977 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-feast\"/\"kube-root-ca.crt\"" Feb 17 12:59:49.475671 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.475361 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-offline-tls\"" Feb 17 12:59:49.475774 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.475366 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-online-tls\"" Feb 17 12:59:49.475914 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.475899 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-feast\"/\"openshift-service-ca.crt\"" Feb 17 12:59:49.482156 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.482028 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"] Feb 17 12:59:49.524904 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.524855 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.525059 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.524932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-online-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.525059 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.525008 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4vfj\" (UniqueName: \"kubernetes.io/projected/4c725099-ff99-4b9b-81dc-04319606d380-kube-api-access-s4vfj\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.525059 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.525040 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"feast-data\" 
(UniqueName: \"kubernetes.io/empty-dir/4c725099-ff99-4b9b-81dc-04319606d380-feast-data\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.525255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.525076 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-registry-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.525255 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.525170 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.626466 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.626433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-registry-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.626660 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.626486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.626660 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.626539 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.626660 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.626566 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-online-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.626835 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:49.626680 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-online-tls: secret "feast-simple-feast-setup-online-tls" not found Feb 17 12:59:49.626835 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:49.626697 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-ui-tls: secret "feast-simple-feast-setup-ui-tls" not found Feb 17 12:59:49.626835 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:49.626751 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-online-tls 
podName:4c725099-ff99-4b9b-81dc-04319606d380 nodeName:}" failed. No retries permitted until 2026-02-17 12:59:50.126729878 +0000 UTC m=+813.721544684 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "online-tls" (UniqueName: "kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-online-tls") pod "feast-simple-feast-setup-565c46b746-5p2q7" (UID: "4c725099-ff99-4b9b-81dc-04319606d380") : secret "feast-simple-feast-setup-online-tls" not found Feb 17 12:59:49.626835 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:49.626770 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-ui-tls podName:4c725099-ff99-4b9b-81dc-04319606d380 nodeName:}" failed. No retries permitted until 2026-02-17 12:59:50.126760637 +0000 UTC m=+813.721575445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ui-tls" (UniqueName: "kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-ui-tls") pod "feast-simple-feast-setup-565c46b746-5p2q7" (UID: "4c725099-ff99-4b9b-81dc-04319606d380") : secret "feast-simple-feast-setup-ui-tls" not found Feb 17 12:59:49.626835 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.626789 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4vfj\" (UniqueName: \"kubernetes.io/projected/4c725099-ff99-4b9b-81dc-04319606d380-kube-api-access-s4vfj\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.626835 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.626821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/4c725099-ff99-4b9b-81dc-04319606d380-feast-data\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.627174 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:49.627148 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-offline-tls: secret "feast-simple-feast-setup-offline-tls" not found Feb 17 12:59:49.627243 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:49.627199 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-offline-tls podName:4c725099-ff99-4b9b-81dc-04319606d380 nodeName:}" failed. No retries permitted until 2026-02-17 12:59:50.127182415 +0000 UTC m=+813.721997232 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "offline-tls" (UniqueName: "kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-offline-tls") pod "feast-simple-feast-setup-565c46b746-5p2q7" (UID: "4c725099-ff99-4b9b-81dc-04319606d380") : secret "feast-simple-feast-setup-offline-tls" not found Feb 17 12:59:49.627314 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.627247 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/4c725099-ff99-4b9b-81dc-04319606d380-feast-data\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.629474 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.629453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-registry-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:49.636983 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:49.636960 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4vfj\" (UniqueName: \"kubernetes.io/projected/4c725099-ff99-4b9b-81dc-04319606d380-kube-api-access-s4vfj\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:50.131204 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:50.131173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:50.131204 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:50.131210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-online-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:50.131462 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:50.131270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:50.131462 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:50.131314 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-ui-tls: secret "feast-simple-feast-setup-ui-tls" not found Feb 17 12:59:50.131462 ip-10-0-131-216 kubenswrapper[2573]: E0217 12:59:50.131386 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-ui-tls podName:4c725099-ff99-4b9b-81dc-04319606d380 nodeName:}" failed. No retries permitted until 2026-02-17 12:59:51.131369191 +0000 UTC m=+814.726183997 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "ui-tls" (UniqueName: "kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-ui-tls") pod "feast-simple-feast-setup-565c46b746-5p2q7" (UID: "4c725099-ff99-4b9b-81dc-04319606d380") : secret "feast-simple-feast-setup-ui-tls" not found Feb 17 12:59:50.133566 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:50.133541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-online-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:50.133672 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:50.133643 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:51.141750 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:51.141701 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:51.144178 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:51.144154 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-5p2q7\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" Feb 17 12:59:51.292868 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:51.292833 2573 util.go:30] "No sandbox for pod can be found. 
Feb 17 12:59:51.412348 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:51.412319 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"]
Feb 17 12:59:51.413702 ip-10-0-131-216 kubenswrapper[2573]: W0217 12:59:51.413671 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c725099_ff99_4b9b_81dc_04319606d380.slice/crio-b761c64be74ca9f97a0aaddde54a72848cc8c0d8203c0c35241af0fca80b22bb WatchSource:0}: Error finding container b761c64be74ca9f97a0aaddde54a72848cc8c0d8203c0c35241af0fca80b22bb: Status 404 returned error can't find the container with id b761c64be74ca9f97a0aaddde54a72848cc8c0d8203c0c35241af0fca80b22bb
Feb 17 12:59:51.632858 ip-10-0-131-216 kubenswrapper[2573]: I0217 12:59:51.632813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" event={"ID":"4c725099-ff99-4b9b-81dc-04319606d380","Type":"ContainerStarted","Data":"b761c64be74ca9f97a0aaddde54a72848cc8c0d8203c0c35241af0fca80b22bb"}
Feb 17 13:00:07.702455 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:07.702426 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" event={"ID":"4c725099-ff99-4b9b-81dc-04319606d380","Type":"ContainerStarted","Data":"1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7"}
Feb 17 13:00:11.719002 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:11.718971 2573 generic.go:358] "Generic (PLEG): container finished" podID="4c725099-ff99-4b9b-81dc-04319606d380" containerID="1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7" exitCode=0
Feb 17 13:00:11.719002 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:11.719007 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" event={"ID":"4c725099-ff99-4b9b-81dc-04319606d380","Type":"ContainerDied","Data":"1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7"}
Feb 17 13:00:12.727131 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:12.727069 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" event={"ID":"4c725099-ff99-4b9b-81dc-04319606d380","Type":"ContainerStarted","Data":"8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18"}
Feb 17 13:00:12.727618 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:12.727140 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" event={"ID":"4c725099-ff99-4b9b-81dc-04319606d380","Type":"ContainerStarted","Data":"5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6"}
Feb 17 13:00:12.727618 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:12.727158 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" event={"ID":"4c725099-ff99-4b9b-81dc-04319606d380","Type":"ContainerStarted","Data":"2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9"}
Feb 17 13:00:12.727618 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:12.727171 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" event={"ID":"4c725099-ff99-4b9b-81dc-04319606d380","Type":"ContainerStarted","Data":"a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779"}
Feb 17 13:00:12.762342 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:12.761333 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podStartSLOduration=7.540774269 podStartE2EDuration="23.761313582s" podCreationTimestamp="2026-02-17 12:59:49 +0000 UTC" firstStartedPulling="2026-02-17 12:59:51.41552099 +0000 UTC m=+815.010335794" lastFinishedPulling="2026-02-17 13:00:07.63606029 +0000 UTC m=+831.230875107" observedRunningTime="2026-02-17 13:00:12.758695645 +0000 UTC m=+836.353510484" watchObservedRunningTime="2026-02-17 13:00:12.761313582 +0000 UTC m=+836.356128408"
Feb 17 13:00:15.293400 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:15.293354 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:15.293400 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:15.293407 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:15.293926 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:15.293425 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:15.293926 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:15.293438 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:15.295658 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:15.295623 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="registry" probeResult="failure" output="dial tcp 10.133.0.34:6571: connect: connection refused"
Feb 17 13:00:15.295895 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:15.295646 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="offline" probeResult="failure" output="dial tcp 10.133.0.34:8816: connect: connection refused"
Feb 17 13:00:15.295972 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:15.295696 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.34:8443: connect: connection refused"
Feb 17 13:00:15.295972 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:15.295725 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="online" probeResult="failure" output="Get \"https://10.133.0.34:6567/health\": dial tcp 10.133.0.34:6567: connect: connection refused"
Feb 17 13:00:18.294166 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:18.294134 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:18.294583 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:18.294221 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
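The pod_startup_latency_tracker entry above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that value minus the image-pull window (lastFinishedPulling minus firstStartedPulling, computed here from the monotonic m=+... offsets logged with each timestamp). Checking the arithmetic:

```python
# Cross-check of the "Observed pod startup duration" entry above.
e2e = 23.761313582                            # 13:00:12.761313582 - 12:59:49 (E2E duration)
pull_window = 831.230875107 - 815.010335794   # lastFinishedPulling - firstStartedPulling (m=+ offsets)
slo = e2e - pull_window
print(f"{slo:.9f}")                           # -> 7.540774269, matching podStartSLOduration
```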
Feb 17 13:00:18.294583 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:18.294435 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:18.294758 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:18.294702 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:18.294758 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:18.294736 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:18.294758 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:18.294751 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:18.295032 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:18.295010 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:18.295183 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:18.295166 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:18.295236 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:18.295213 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:18.298523 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:18.298506 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:18.748105 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:18.748076 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:00:18.755768 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:00:18.755738 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:01:16.922736 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:16.922655 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/2.log"
Feb 17 13:01:16.923913 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:16.923894 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/2.log"
Feb 17 13:01:16.929500 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:16.929473 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log"
Feb 17 13:01:16.930739 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:16.930717 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log"
Feb 17 13:01:21.350285 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.350253 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"]
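The probe entries above pin down four startup probes, one per container, from their failure outputs: registry on TCP 6571, offline on TCP 8816, ui on TCP 8443, and online via HTTPS GET /health on 6567. A hedged reconstruction of the probe stanzas the pod spec presumably carries; the ports and the /health path come straight from the log, while periodSeconds and failureThreshold are inferences (the 3 s gap between the 13:00:15 failures and the 13:00:18 "started" transitions suggests a 3 s period, but the log does not state it):

```python
# Startup probes implied by the prober.go entries above.
# Handlers are from the log; the period/threshold numbers are assumptions.
startup_probes = {
    "registry": {"tcpSocket": {"port": 6571}},
    "offline":  {"tcpSocket": {"port": 8816}},
    "ui":       {"tcpSocket": {"port": 8443}},
    "online":   {"httpGet": {"path": "/health", "port": 6567, "scheme": "HTTPS"}},
}
for name, handler in startup_probes.items():
    probe = {**handler, "periodSeconds": 3, "failureThreshold": 30}  # assumed values
    print(name, probe)
```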
Feb 17 13:01:21.350690 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.350569 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="registry" containerID="cri-o://a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779" gracePeriod=30
Feb 17 13:01:21.350690 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.350604 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="ui" containerID="cri-o://8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18" gracePeriod=30
Feb 17 13:01:21.350806 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.350672 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="offline" containerID="cri-o://5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6" gracePeriod=30
Feb 17 13:01:21.350806 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.350715 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="online" containerID="cri-o://2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9" gracePeriod=30
Feb 17 13:01:21.582377 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.582341 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"]
Feb 17 13:01:21.587140 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.587098 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.590407 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.590240 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-dockercfg-mfcv9\""
Feb 17 13:01:21.600981 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.600904 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"]
Feb 17 13:01:21.739968 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.739914 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-feast-data\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.739968 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.739971 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.740270 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.740058 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.740270 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.740134 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp4m2\" (UniqueName: \"kubernetes.io/projected/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-kube-api-access-rp4m2\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.740270 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.740187 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-online-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.740270 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.740213 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-registry-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.841369 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.841317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp4m2\" (UniqueName: \"kubernetes.io/projected/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-kube-api-access-rp4m2\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.841560 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.841390 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-online-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.841560 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.841422 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-registry-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.841560 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.841491 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-feast-data\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.841560 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.841521 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.841560 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.841557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.841820 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:01:21.841683 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-offline-tls: secret "feast-simple-feast-setup-offline-tls" not found
Feb 17 13:01:21.841820 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:01:21.841742 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-offline-tls podName:5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e nodeName:}" failed. No retries permitted until 2026-02-17 13:01:22.341721219 +0000 UTC m=+905.936536037 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "offline-tls" (UniqueName: "kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-offline-tls") pod "feast-simple-feast-setup-565c46b746-tp5bg" (UID: "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e") : secret "feast-simple-feast-setup-offline-tls" not found
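Note the durationBeforeRetry values: this pod's first failure gets 500ms, while the earlier pod's retry above showed 1s. That pattern is consistent with a per-volume-operation exponential backoff that starts at 500ms and doubles on each consecutive failure; the doubling and the cap in the sketch below are assumptions about kubelet internals, not something stated in this log:

```python
# Backoff schedule consistent with the observed 500ms and 1s retry delays.
# Initial value matches the log; factor and cap are assumptions.
def backoff_schedule(initial=0.5, factor=2.0, cap=120.0, n=8):
    d = initial
    for _ in range(n):
        yield d
        d = min(d * factor, cap)

print([f"{d:g}s" for d in backoff_schedule()])
# -> ['0.5s', '1s', '2s', '4s', '8s', '16s', '32s', '64s']
```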
Feb 17 13:01:21.842304 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.842279 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-feast-data\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.842431 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:01:21.842364 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-ui-tls: secret "feast-simple-feast-setup-ui-tls" not found
Feb 17 13:01:21.842431 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:01:21.842425 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-ui-tls podName:5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e nodeName:}" failed. No retries permitted until 2026-02-17 13:01:22.342408389 +0000 UTC m=+905.937223210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ui-tls" (UniqueName: "kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-ui-tls") pod "feast-simple-feast-setup-565c46b746-tp5bg" (UID: "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e") : secret "feast-simple-feast-setup-ui-tls" not found
Feb 17 13:01:21.844835 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.844811 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-registry-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.845285 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.845266 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-online-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:21.850276 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:21.850251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp4m2\" (UniqueName: \"kubernetes.io/projected/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-kube-api-access-rp4m2\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:22.345226 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:22.345106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:22.345226 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:22.345201 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:22.347626 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:22.347599 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:22.347752 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:22.347717 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-tp5bg\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:22.503928 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:22.503887 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:22.628673 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:22.628638 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"]
Feb 17 13:01:22.629773 ip-10-0-131-216 kubenswrapper[2573]: W0217 13:01:22.629751 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f341b4d_1c1e_4ae7_a80c_8f66a5eb896e.slice/crio-310003a21069de67e3cffeb95314eaec359931ecf02d04ebcf666204e9721d17 WatchSource:0}: Error finding container 310003a21069de67e3cffeb95314eaec359931ecf02d04ebcf666204e9721d17: Status 404 returned error can't find the container with id 310003a21069de67e3cffeb95314eaec359931ecf02d04ebcf666204e9721d17
Feb 17 13:01:22.939654 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:22.939623 2573 generic.go:358] "Generic (PLEG): container finished" podID="4c725099-ff99-4b9b-81dc-04319606d380" containerID="8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18" exitCode=0
Feb 17 13:01:22.939654 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:22.939648 2573 generic.go:358] "Generic (PLEG): container finished" podID="4c725099-ff99-4b9b-81dc-04319606d380" containerID="2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9" exitCode=0
Feb 17 13:01:22.939871 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:22.939698 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" event={"ID":"4c725099-ff99-4b9b-81dc-04319606d380","Type":"ContainerDied","Data":"8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18"}
Feb 17 13:01:22.939871 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:22.939740 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" event={"ID":"4c725099-ff99-4b9b-81dc-04319606d380","Type":"ContainerDied","Data":"2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9"}
Feb 17 13:01:22.940919 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:22.940893 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" event={"ID":"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e","Type":"ContainerStarted","Data":"32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e"}
Feb 17 13:01:22.941050 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:22.940924 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" event={"ID":"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e","Type":"ContainerStarted","Data":"310003a21069de67e3cffeb95314eaec359931ecf02d04ebcf666204e9721d17"}
Feb 17 13:01:26.954171 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:26.954139 2573 generic.go:358] "Generic (PLEG): container finished" podID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerID="32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e" exitCode=0
Feb 17 13:01:26.954552 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:26.954142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" event={"ID":"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e","Type":"ContainerDied","Data":"32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e"}
Feb 17 13:01:27.962206 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:27.962161 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" event={"ID":"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e","Type":"ContainerStarted","Data":"54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336"}
Feb 17 13:01:27.962686 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:27.962232 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" event={"ID":"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e","Type":"ContainerStarted","Data":"87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019"}
Feb 17 13:01:27.962686 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:27.962248 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" event={"ID":"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e","Type":"ContainerStarted","Data":"5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22"}
Feb 17 13:01:27.962686 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:27.962262 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" event={"ID":"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e","Type":"ContainerStarted","Data":"5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede"}
Feb 17 13:01:27.986405 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:27.986340 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podStartSLOduration=6.986322163 podStartE2EDuration="6.986322163s" podCreationTimestamp="2026-02-17 13:01:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:01:27.982718706 +0000 UTC m=+911.577533532" watchObservedRunningTime="2026-02-17 13:01:27.986322163 +0000 UTC m=+911.581136989"
Feb 17 13:01:28.295650 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:28.295226 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.34:8443: connect: connection refused"
Feb 17 13:01:28.505152 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:28.505053 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:28.505152 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:28.505104 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:28.505152 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:28.505134 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:28.505152 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:28.505146 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:28.507365 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:28.507315 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="offline" probeResult="failure" output="dial tcp 10.133.0.35:8816: connect: connection refused"
Feb 17 13:01:28.507365 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:28.507341 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="registry" probeResult="failure" output="dial tcp 10.133.0.35:6571: connect: connection refused"
Feb 17 13:01:28.507565 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:28.507359 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.35:8443: connect: connection refused"
Feb 17 13:01:28.507565 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:28.507431 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="online" probeResult="failure" output="Get \"https://10.133.0.35:6567/health\": dial tcp 10.133.0.35:6567: connect: connection refused"
Feb 17 13:01:28.748623 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:28.748576 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="online" probeResult="failure" output="Get \"https://10.133.0.34:6567/health\": dial tcp 10.133.0.34:6567: connect: connection refused"
Feb 17 13:01:31.505946 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:31.505909 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:31.506523 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:31.506010 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:31.506523 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:31.506236 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:31.506695 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:31.506521 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:31.506695 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:31.506549 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:31.506695 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:31.506563 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:31.506910 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:31.506825 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:31.506910 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:31.506878 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:31.507027 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:31.507010 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:31.510811 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:31.510788 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:31.977752 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:31.977724 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:31.980648 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:31.980627 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:01:38.294966 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:38.294911 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.34:8443: connect: connection refused"
Feb 17 13:01:38.749071 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:38.749028 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="online" probeResult="failure" output="Get \"https://10.133.0.34:6567/health\": dial tcp 10.133.0.34:6567: connect: connection refused"
Feb 17 13:01:48.295279 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:48.295227 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.34:8443: connect: connection refused"
Feb 17 13:01:48.295788 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:48.295385 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:01:48.749239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:48.749192 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="online" probeResult="failure" output="Get \"https://10.133.0.34:6567/health\": dial tcp 10.133.0.34:6567: connect: connection refused"
Feb 17 13:01:48.749414 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:48.749351 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:01:51.993755 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:51.993731 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:01:52.039637 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.039603 2573 generic.go:358] "Generic (PLEG): container finished" podID="4c725099-ff99-4b9b-81dc-04319606d380" containerID="5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6" exitCode=137
Feb 17 13:01:52.039637 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.039630 2573 generic.go:358] "Generic (PLEG): container finished" podID="4c725099-ff99-4b9b-81dc-04319606d380" containerID="a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779" exitCode=137
Feb 17 13:01:52.039860 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.039671 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"
Feb 17 13:01:52.039860 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.039691 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" event={"ID":"4c725099-ff99-4b9b-81dc-04319606d380","Type":"ContainerDied","Data":"5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6"}
Feb 17 13:01:52.039860 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.039742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" event={"ID":"4c725099-ff99-4b9b-81dc-04319606d380","Type":"ContainerDied","Data":"a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779"}
Feb 17 13:01:52.039860 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.039759 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7" event={"ID":"4c725099-ff99-4b9b-81dc-04319606d380","Type":"ContainerDied","Data":"b761c64be74ca9f97a0aaddde54a72848cc8c0d8203c0c35241af0fca80b22bb"}
Feb 17 13:01:52.039860 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.039784 2573 scope.go:117] "RemoveContainer" containerID="8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18"
Feb 17 13:01:52.048987 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.048967 2573 scope.go:117] "RemoveContainer" containerID="5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6"
Feb 17 13:01:52.056593 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.056568 2573 scope.go:117] "RemoveContainer" containerID="2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9"
Feb 17 13:01:52.063736 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.063720 2573 scope.go:117] "RemoveContainer" containerID="a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779"
Feb 17 13:01:52.071077 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.071056 2573 scope.go:117] "RemoveContainer" containerID="1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7"
Feb 17 13:01:52.086556 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.086534 2573 scope.go:117] "RemoveContainer" containerID="8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18"
Feb 17 13:01:52.086839 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:01:52.086822 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18\": container with ID starting with 8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18 not found: ID does not exist" containerID="8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18"
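The exitCode=137 values above decode as "killed by signal 9" (137 = 128 + SIGKILL), and the timing lines up with the gracePeriod=30 from the "Killing container with a grace period" entries: the kill started at 13:01:21.35 and the containers were observed dead at 13:01:52.04, i.e. they did not exit on SIGTERM and were SIGKILLed when the 30 s grace period lapsed. A quick check of both facts:

```python
# Decode exit code 137 and verify the grace-period timing from the log.
import signal
exit_code = 137
print(exit_code - 128 == signal.SIGKILL)       # True: terminated by SIGKILL
kill_started  = 21 + 0.350569                  # 13:01:21.350569 (kill initiated)
observed_dead = 52 + 0.039637                  # 13:01:52.039637 (PLEG sees exit)
print(f"{observed_dead - kill_started:.1f}s")  # ~30.7s: 30s grace + PLEG relist lag
```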
Feb 17 13:01:52.086895 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.086848 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18"} err="failed to get container status \"8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18\": rpc error: code = NotFound desc = could not find container \"8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18\": container with ID starting with 8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18 not found: ID does not exist"
Feb 17 13:01:52.086895 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.086866 2573 scope.go:117] "RemoveContainer" containerID="5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6"
Feb 17 13:01:52.087146 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:01:52.087123 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6\": container with ID starting with 5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6 not found: ID does not exist" containerID="5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6"
Feb 17 13:01:52.087205 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.087152 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6"} err="failed to get container status \"5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6\": rpc error: code = NotFound desc = could not find container \"5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6\": container with ID starting with 5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6 not found: ID does not exist"
Feb 17 13:01:52.087205 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.087167 2573 scope.go:117] "RemoveContainer" containerID="2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9"
Feb 17 13:01:52.087440 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:01:52.087421 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9\": container with ID starting with 2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9 not found: ID does not exist" containerID="2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9"
Feb 17 13:01:52.087490 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.087447 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9"} err="failed to get container status \"2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9\": rpc error: code = NotFound desc = could not find container \"2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9\": container with ID starting with 2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9 not found: ID does not exist"
Feb 17 13:01:52.087490 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.087466 2573 scope.go:117] "RemoveContainer" containerID="a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779"
Feb 17 13:01:52.087676 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:01:52.087662 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779\": container with ID starting with a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779 not found: ID does not exist" containerID="a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779"
Feb 17 13:01:52.087964 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.087679 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779"} err="failed to get container status \"a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779\": rpc error: code = NotFound desc = could not find container \"a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779\": container with ID starting with a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779 not found: ID does not exist"
Feb 17 13:01:52.087964 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.087691 2573 scope.go:117] "RemoveContainer" containerID="1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7"
Feb 17 13:01:52.087964 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:01:52.087886 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7\": container with ID starting with 1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7 not found: ID does not exist" containerID="1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7"
Feb 17 13:01:52.087964 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.087901 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7"} err="failed to get container status \"1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7\": rpc error: code = NotFound desc = could not find container \"1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7\": container with ID starting with 1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7 not found: ID does not exist"
Feb 17 13:01:52.087964 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.087912 2573 scope.go:117] "RemoveContainer" containerID="8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18"
Feb 17 13:01:52.088285 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.088138 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18"} err="failed to get container status \"8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18\": rpc error: code = NotFound desc = could not find container \"8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18\": container with ID starting with 8da5eb96ea2ee4f1c3fdb47e5c2691b40e1cb4ee296046179bdcadc67d156d18 not found: ID does not exist"
Feb 17 13:01:52.088285 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.088160 2573 scope.go:117] "RemoveContainer" containerID="5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6"
Feb 17 13:01:52.088401 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.088384 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6"} err="failed to get container status \"5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6\": rpc error: code = NotFound desc = could not find container \"5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6\": container with ID starting with 5f4642d5aaaa554cb2b181d94b13469e94cc2728d436aa01162a84ab5994caf6 not found: ID does not exist"
Feb 17 13:01:52.088443 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.088402 2573 scope.go:117] "RemoveContainer" containerID="2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9"
Feb 17 13:01:52.088590 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.088576 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9"} err="failed to get container status \"2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9\": rpc error: code = NotFound desc = could not find container \"2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9\": container with ID starting with 2b777499bb7c3c4629feef8b7b1b09dcc8513698720f4add1bd19403ba6588d9 not found: ID does not exist"
Feb 17 13:01:52.088590 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.088589 2573 scope.go:117] "RemoveContainer" containerID="a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779"
Feb 17 13:01:52.088751 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.088737 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779"} err="failed to get container status \"a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779\": rpc error: code = NotFound desc = could not find container \"a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779\": container with ID starting with a579b5a93d22f6d8ecf1da89ca4e75273c669ad0b38974a7227513d35d94c779 not found: ID does not exist"
Feb 17 13:01:52.088800 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.088750 2573 scope.go:117] "RemoveContainer" containerID="1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7"
Feb 17 13:01:52.088921 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.088904 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7"} err="failed to get container status \"1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7\": rpc error: code = NotFound desc = could not find container \"1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7\": container with ID starting with 1ffd5beb8eb5adbfe3644af010734b6895cfdaaaad40e00a7701814a5dd2a8b7 not found: ID does not exist"
Feb 17 13:01:52.107264 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.107240 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-online-tls\") pod \"4c725099-ff99-4b9b-81dc-04319606d380\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") "
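The RemoveContainer/NotFound churn above is benign: the kubelet issues RemoveContainer for each dead container on successive cleanup passes, a pass races with a removal that has already happened, and the runtime answers NotFound, which the kubelet logs and treats as done. The same idempotent-delete pattern in miniature; ContainerRuntime and NotFoundError below are illustrative stand-ins, not a real client API:

```python
# Idempotent delete: if the object being deleted is already gone,
# that *is* success -- the behavior the kubelet exhibits above.
class NotFoundError(Exception):
    """Stand-in for the runtime's NotFound response."""

def remove_container(runtime, container_id):
    try:
        runtime.remove(container_id)          # hypothetical runtime call
    except NotFoundError:
        # Already removed, possibly by a concurrent cleanup pass.
        # Swallow the error instead of failing the whole cleanup.
        pass
```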
Feb 17 13:01:52.107357 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.107305 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-registry-tls\") pod \"4c725099-ff99-4b9b-81dc-04319606d380\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") "
Feb 17 13:01:52.107357 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.107335 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4vfj\" (UniqueName: \"kubernetes.io/projected/4c725099-ff99-4b9b-81dc-04319606d380-kube-api-access-s4vfj\") pod \"4c725099-ff99-4b9b-81dc-04319606d380\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") "
Feb 17 13:01:52.107357 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.107353 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/4c725099-ff99-4b9b-81dc-04319606d380-feast-data\") pod \"4c725099-ff99-4b9b-81dc-04319606d380\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") "
Feb 17 13:01:52.107545 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.107394 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-offline-tls\") pod \"4c725099-ff99-4b9b-81dc-04319606d380\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") "
Feb 17 13:01:52.107545 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.107451 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-ui-tls\") pod \"4c725099-ff99-4b9b-81dc-04319606d380\" (UID: \"4c725099-ff99-4b9b-81dc-04319606d380\") "
Feb 17 13:01:52.107903 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.107876 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c725099-ff99-4b9b-81dc-04319606d380-feast-data" (OuterVolumeSpecName: "feast-data") pod "4c725099-ff99-4b9b-81dc-04319606d380" (UID: "4c725099-ff99-4b9b-81dc-04319606d380"). InnerVolumeSpecName "feast-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Feb 17 13:01:52.109700 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.109678 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-ui-tls" (OuterVolumeSpecName: "ui-tls") pod "4c725099-ff99-4b9b-81dc-04319606d380" (UID: "4c725099-ff99-4b9b-81dc-04319606d380"). InnerVolumeSpecName "ui-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Feb 17 13:01:52.109871 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.109847 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4c725099-ff99-4b9b-81dc-04319606d380" (UID: "4c725099-ff99-4b9b-81dc-04319606d380"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Feb 17 13:01:52.109950 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.109875 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-offline-tls" (OuterVolumeSpecName: "offline-tls") pod "4c725099-ff99-4b9b-81dc-04319606d380" (UID: "4c725099-ff99-4b9b-81dc-04319606d380"). InnerVolumeSpecName "offline-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Feb 17 13:01:52.109950 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.109872 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c725099-ff99-4b9b-81dc-04319606d380-kube-api-access-s4vfj" (OuterVolumeSpecName: "kube-api-access-s4vfj") pod "4c725099-ff99-4b9b-81dc-04319606d380" (UID: "4c725099-ff99-4b9b-81dc-04319606d380"). InnerVolumeSpecName "kube-api-access-s4vfj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Feb 17 13:01:52.109950 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.109896 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-online-tls" (OuterVolumeSpecName: "online-tls") pod "4c725099-ff99-4b9b-81dc-04319606d380" (UID: "4c725099-ff99-4b9b-81dc-04319606d380"). InnerVolumeSpecName "online-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Feb 17 13:01:52.208239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.208203 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-registry-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Feb 17 13:01:52.208239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.208234 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s4vfj\" (UniqueName: \"kubernetes.io/projected/4c725099-ff99-4b9b-81dc-04319606d380-kube-api-access-s4vfj\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Feb 17 13:01:52.208239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.208245 2573 reconciler_common.go:299] "Volume detached for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/4c725099-ff99-4b9b-81dc-04319606d380-feast-data\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Feb 17 13:01:52.208477 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.208256 2573 reconciler_common.go:299] "Volume detached for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-offline-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Feb 17 13:01:52.208477 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.208265 2573 reconciler_common.go:299] "Volume detached for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-ui-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Feb 17 13:01:52.208477 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.208273 2573 reconciler_common.go:299] "Volume detached for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/4c725099-ff99-4b9b-81dc-04319606d380-online-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Feb 17 13:01:52.360793 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.360763 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"]
Feb 17 13:01:52.364663 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.364367 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-5p2q7"]
Feb 17 13:01:52.934233 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:01:52.934202 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c725099-ff99-4b9b-81dc-04319606d380" path="/var/lib/kubelet/pods/4c725099-ff99-4b9b-81dc-04319606d380/volumes"
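After the unmounts and "Volume detached" entries, the kubelet removes the per-pod volume directory under /var/lib/kubelet/pods/<podUID>/volumes (the exact path appears in the "Cleaned up orphaned pod volumes dir" entry). A node-side spot check one might run to confirm nothing was left behind; the path is from the log, the check itself is just a suggestion:

```python
# Verify the per-pod volume dir is gone after the cleanup entry above.
import os

pod_uid = "4c725099-ff99-4b9b-81dc-04319606d380"
path = f"/var/lib/kubelet/pods/{pod_uid}/volumes"
print(path, "exists:", os.path.exists(path))   # expected on this node: exists: False
```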
13:02:33.327551 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"] Feb 17 13:02:33.328193 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328098 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="offline" Feb 17 13:02:33.328193 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328149 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="offline" Feb 17 13:02:33.328193 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328164 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="online" Feb 17 13:02:33.328193 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328174 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="online" Feb 17 13:02:33.328193 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328189 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="ui" Feb 17 13:02:33.328473 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328197 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="ui" Feb 17 13:02:33.328473 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328214 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="registry" Feb 17 13:02:33.328473 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328223 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="registry" Feb 17 13:02:33.328473 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328246 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="feast-init" Feb 17 13:02:33.328473 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328255 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="feast-init" Feb 17 13:02:33.328473 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328340 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="ui" Feb 17 13:02:33.328473 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328354 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="offline" Feb 17 13:02:33.328473 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328365 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="registry" Feb 17 13:02:33.328473 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.328375 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c725099-ff99-4b9b-81dc-04319606d380" containerName="online" Feb 17 13:02:33.331639 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.331617 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" Feb 17 13:02:33.334310 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.334288 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-online-tls\"" Feb 17 13:02:33.334459 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.334437 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-offline-tls\"" Feb 17 13:02:33.334593 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.334520 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-remote-registry\"/\"openshift-service-ca.crt\"" Feb 17 13:02:33.334877 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.334856 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-client-ca\"" Feb 17 13:02:33.335755 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.335735 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-ui-tls\"" Feb 17 13:02:33.335870 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.335848 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-remote-registry\"/\"kube-root-ca.crt\"" Feb 17 13:02:33.335981 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.335957 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-dockercfg-w7gg6\"" Feb 17 13:02:33.341764 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.341741 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"] Feb 17 13:02:33.470649 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.470610 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-ui-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" Feb 17 13:02:33.470649 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.470656 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/7cd33478-287f-4b77-9e7e-047be0c2fbba-feast-data\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" Feb 17 13:02:33.470878 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.470701 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-offline-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" Feb 17 13:02:33.470878 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.470730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"online-tls\" (UniqueName: 
\"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-online-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" Feb 17 13:02:33.470878 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.470761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/7cd33478-287f-4b77-9e7e-047be0c2fbba-registry-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" Feb 17 13:02:33.470878 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.470800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdk8g\" (UniqueName: \"kubernetes.io/projected/7cd33478-287f-4b77-9e7e-047be0c2fbba-kube-api-access-vdk8g\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" Feb 17 13:02:33.571976 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.571937 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-ui-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" Feb 17 13:02:33.571976 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.571979 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/7cd33478-287f-4b77-9e7e-047be0c2fbba-feast-data\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" Feb 17 13:02:33.572247 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:02:33.572090 2573 secret.go:189] Couldn't get secret test-ns-remote-registry/feast-simple-feast-remote-setup-ui-tls: secret "feast-simple-feast-remote-setup-ui-tls" not found Feb 17 13:02:33.572247 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.572121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-offline-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" Feb 17 13:02:33.572247 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:02:33.572202 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-ui-tls podName:7cd33478-287f-4b77-9e7e-047be0c2fbba nodeName:}" failed. No retries permitted until 2026-02-17 13:02:34.072178909 +0000 UTC m=+977.666993729 (durationBeforeRetry 500ms). 
Feb 17 13:02:33.572400 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.572245 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-online-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:33.572400 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.572286 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/7cd33478-287f-4b77-9e7e-047be0c2fbba-registry-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:33.572400 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.572340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdk8g\" (UniqueName: \"kubernetes.io/projected/7cd33478-287f-4b77-9e7e-047be0c2fbba-kube-api-access-vdk8g\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:33.572400 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.572380 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/7cd33478-287f-4b77-9e7e-047be0c2fbba-feast-data\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:33.572988 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.572968 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/7cd33478-287f-4b77-9e7e-047be0c2fbba-registry-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:33.574548 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.574526 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-offline-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:33.574668 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.574643 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-online-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:33.580633 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:33.580573 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdk8g\" (UniqueName: \"kubernetes.io/projected/7cd33478-287f-4b77-9e7e-047be0c2fbba-kube-api-access-vdk8g\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:34.077390 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:34.077347 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-ui-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:34.079774 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:34.079748 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-ui-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-s4j46\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:34.243423 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:34.243392 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:34.365542 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:34.365518 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"]
Feb 17 13:02:34.367493 ip-10-0-131-216 kubenswrapper[2573]: W0217 13:02:34.367464 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cd33478_287f_4b77_9e7e_047be0c2fbba.slice/crio-c1be09818af88dcc6787d9da60a57daf7bd68a163e7545b4ee68773928d6b5c4 WatchSource:0}: Error finding container c1be09818af88dcc6787d9da60a57daf7bd68a163e7545b4ee68773928d6b5c4: Status 404 returned error can't find the container with id c1be09818af88dcc6787d9da60a57daf7bd68a163e7545b4ee68773928d6b5c4
Feb 17 13:02:35.167415 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:35.167380 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" event={"ID":"7cd33478-287f-4b77-9e7e-047be0c2fbba","Type":"ContainerStarted","Data":"e502b0b1e3e92c5944573bac3fa8b22c3f82b75db94bacaf8de6655552ffef8c"}
Feb 17 13:02:35.167415 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:35.167422 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" event={"ID":"7cd33478-287f-4b77-9e7e-047be0c2fbba","Type":"ContainerStarted","Data":"c1be09818af88dcc6787d9da60a57daf7bd68a163e7545b4ee68773928d6b5c4"}
Feb 17 13:02:38.177782 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:38.177750 2573 generic.go:358] "Generic (PLEG): container finished" podID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerID="e502b0b1e3e92c5944573bac3fa8b22c3f82b75db94bacaf8de6655552ffef8c" exitCode=0
Feb 17 13:02:38.178194 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:38.177818 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" event={"ID":"7cd33478-287f-4b77-9e7e-047be0c2fbba","Type":"ContainerDied","Data":"e502b0b1e3e92c5944573bac3fa8b22c3f82b75db94bacaf8de6655552ffef8c"}
Feb 17 13:02:39.184413 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:39.184378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" event={"ID":"7cd33478-287f-4b77-9e7e-047be0c2fbba","Type":"ContainerStarted","Data":"7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628"}
Feb 17 13:02:39.184413 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:39.184420 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" event={"ID":"7cd33478-287f-4b77-9e7e-047be0c2fbba","Type":"ContainerStarted","Data":"532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e"}
Feb 17 13:02:39.185050 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:39.184430 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" event={"ID":"7cd33478-287f-4b77-9e7e-047be0c2fbba","Type":"ContainerStarted","Data":"455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3"}
Feb 17 13:02:39.205369 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:39.205311 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podStartSLOduration=6.205290428 podStartE2EDuration="6.205290428s" podCreationTimestamp="2026-02-17 13:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:02:39.203765485 +0000 UTC m=+982.798580311" watchObservedRunningTime="2026-02-17 13:02:39.205290428 +0000 UTC m=+982.800105253"
Feb 17 13:02:40.244101 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:40.244050 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:40.244101 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:40.244103 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:40.244650 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:40.244133 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:40.246038 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:40.246001 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="offline" probeResult="failure" output="dial tcp 10.133.0.36:8816: connect: connection refused"
Feb 17 13:02:40.246220 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:40.246066 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="online" probeResult="failure" output="Get \"https://10.133.0.36:6567/health\": dial tcp 10.133.0.36:6567: connect: connection refused"
Feb 17 13:02:40.246220 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:40.246001 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.36:8443: connect: connection refused"
Feb 17 13:02:43.244670 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:43.244640 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:43.245030 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:43.244915 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:43.245163 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:43.245142 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:43.245259 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:43.245174 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:43.245436 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:43.245419 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:43.245528 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:43.245514 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:43.249054 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:43.249037 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:44.199727 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:44.199696 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:02:44.202665 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:02:44.202640 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:03:46.686972 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:46.686933 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"]
Feb 17 13:03:46.687532 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:46.687296 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="ui" containerID="cri-o://7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628" gracePeriod=30
Feb 17 13:03:46.687532 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:46.687341 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="offline" containerID="cri-o://532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e" gracePeriod=30
Feb 17 13:03:46.687532 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:46.687482 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="online" containerID="cri-o://455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3" gracePeriod=30
Feb 17 13:03:46.824841 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:46.824810 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"]
Feb 17 13:03:46.825195 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:46.825143 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="registry" containerID="cri-o://5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede" gracePeriod=30
Feb 17 13:03:46.825381 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:46.825200 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="ui" containerID="cri-o://54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336" gracePeriod=30
Feb 17 13:03:46.825471 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:46.825227 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="online" containerID="cri-o://5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22" gracePeriod=30
Feb 17 13:03:46.825533 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:46.825218 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="offline" containerID="cri-o://87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019" gracePeriod=30
Feb 17 13:03:47.388278 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:47.388249 2573 generic.go:358] "Generic (PLEG): container finished" podID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerID="7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628" exitCode=0
Feb 17 13:03:47.388278 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:47.388275 2573 generic.go:358] "Generic (PLEG): container finished" podID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerID="455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3" exitCode=0
Feb 17 13:03:47.388456 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:47.388319 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" event={"ID":"7cd33478-287f-4b77-9e7e-047be0c2fbba","Type":"ContainerDied","Data":"7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628"}
Feb 17 13:03:47.388456 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:47.388362 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" event={"ID":"7cd33478-287f-4b77-9e7e-047be0c2fbba","Type":"ContainerDied","Data":"455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3"}
Feb 17 13:03:47.391866 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:47.391818 2573 generic.go:358] "Generic (PLEG): container finished" podID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerID="54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336" exitCode=0
Feb 17 13:03:47.391989 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:47.391938 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" event={"ID":"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e","Type":"ContainerDied","Data":"54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336"}
Feb 17 13:03:48.397318 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:48.397285 2573 generic.go:358] "Generic (PLEG): container finished" podID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerID="5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22" exitCode=0
Feb 17 13:03:48.397698 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:48.397355 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" event={"ID":"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e","Type":"ContainerDied","Data":"5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22"}
Feb 17 13:03:51.507092 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:51.507043 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.35:8443: connect: connection refused"
Feb 17 13:03:51.978635 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:51.978592 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="online" probeResult="failure" output="Get \"https://10.133.0.35:6567/health\": dial tcp 10.133.0.35:6567: connect: connection refused"
Feb 17 13:03:53.246063 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:53.246019 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.36:8443: connect: connection refused"
Feb 17 13:03:54.200068 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:03:54.200022 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="online" probeResult="failure" output="Get \"https://10.133.0.36:6567/health\": dial tcp 10.133.0.36:6567: connect: connection refused"
Feb 17 13:04:01.506950 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:01.506850 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.35:8443: connect: connection refused"
Feb 17 13:04:01.977903 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:01.977862 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="online" probeResult="failure" output="Get \"https://10.133.0.35:6567/health\": dial tcp 10.133.0.35:6567: connect: connection refused"
Feb 17 13:04:03.245171 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:03.245101 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.36:8443: connect: connection refused"
Feb 17 13:04:04.199978 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:04.199940 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="online" probeResult="failure" output="Get \"https://10.133.0.36:6567/health\": dial tcp 10.133.0.36:6567: connect: connection refused"
Feb 17 13:04:11.507552 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:11.507504 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.35:8443: connect: connection refused"
Feb 17 13:04:11.508067 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:11.507641 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:04:11.978333 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:11.978291 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="online" probeResult="failure" output="Get \"https://10.133.0.35:6567/health\": dial tcp 10.133.0.35:6567: connect: connection refused"
Feb 17 13:04:11.978506 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:11.978413 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:04:13.245313 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:13.245258 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.36:8443: connect: connection refused"
Feb 17 13:04:13.245720 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:13.245414 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:04:14.200861 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:14.200803 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="online" probeResult="failure" output="Get \"https://10.133.0.36:6567/health\": dial tcp 10.133.0.36:6567: connect: connection refused"
Feb 17 13:04:14.201053 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:14.200932 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:04:17.333599 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.333571 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:04:17.453469 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.453412 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:04:17.482876 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.482847 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/7cd33478-287f-4b77-9e7e-047be0c2fbba-feast-data\") pod \"7cd33478-287f-4b77-9e7e-047be0c2fbba\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") "
Feb 17 13:04:17.483059 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.482966 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-online-tls\") pod \"7cd33478-287f-4b77-9e7e-047be0c2fbba\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") "
Feb 17 13:04:17.483059 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.482991 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/7cd33478-287f-4b77-9e7e-047be0c2fbba-registry-tls\") pod \"7cd33478-287f-4b77-9e7e-047be0c2fbba\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") "
Feb 17 13:04:17.483059 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.483039 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-ui-tls\") pod \"7cd33478-287f-4b77-9e7e-047be0c2fbba\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") "
Feb 17 13:04:17.483254 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.483070 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-offline-tls\") pod \"7cd33478-287f-4b77-9e7e-047be0c2fbba\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") "
Feb 17 13:04:17.483254 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.483146 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdk8g\" (UniqueName: \"kubernetes.io/projected/7cd33478-287f-4b77-9e7e-047be0c2fbba-kube-api-access-vdk8g\") pod \"7cd33478-287f-4b77-9e7e-047be0c2fbba\" (UID: \"7cd33478-287f-4b77-9e7e-047be0c2fbba\") "
Feb 17 13:04:17.483688 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.483426 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cd33478-287f-4b77-9e7e-047be0c2fbba-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7cd33478-287f-4b77-9e7e-047be0c2fbba" (UID: "7cd33478-287f-4b77-9e7e-047be0c2fbba"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Feb 17 13:04:17.483688 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.483583 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cd33478-287f-4b77-9e7e-047be0c2fbba-feast-data" (OuterVolumeSpecName: "feast-data") pod "7cd33478-287f-4b77-9e7e-047be0c2fbba" (UID: "7cd33478-287f-4b77-9e7e-047be0c2fbba"). InnerVolumeSpecName "feast-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Feb 17 13:04:17.485765 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.485725 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-ui-tls" (OuterVolumeSpecName: "ui-tls") pod "7cd33478-287f-4b77-9e7e-047be0c2fbba" (UID: "7cd33478-287f-4b77-9e7e-047be0c2fbba"). InnerVolumeSpecName "ui-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Feb 17 13:04:17.485982 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.485956 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-online-tls" (OuterVolumeSpecName: "online-tls") pod "7cd33478-287f-4b77-9e7e-047be0c2fbba" (UID: "7cd33478-287f-4b77-9e7e-047be0c2fbba"). InnerVolumeSpecName "online-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Feb 17 13:04:17.485982 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.485970 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cd33478-287f-4b77-9e7e-047be0c2fbba-kube-api-access-vdk8g" (OuterVolumeSpecName: "kube-api-access-vdk8g") pod "7cd33478-287f-4b77-9e7e-047be0c2fbba" (UID: "7cd33478-287f-4b77-9e7e-047be0c2fbba"). InnerVolumeSpecName "kube-api-access-vdk8g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Feb 17 13:04:17.486128 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.485990 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-offline-tls" (OuterVolumeSpecName: "offline-tls") pod "7cd33478-287f-4b77-9e7e-047be0c2fbba" (UID: "7cd33478-287f-4b77-9e7e-047be0c2fbba"). InnerVolumeSpecName "offline-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Feb 17 13:04:17.488927 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.488900 2573 generic.go:358] "Generic (PLEG): container finished" podID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerID="532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e" exitCode=137
Feb 17 13:04:17.489015 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.488982 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"
Feb 17 13:04:17.489150 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.488979 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" event={"ID":"7cd33478-287f-4b77-9e7e-047be0c2fbba","Type":"ContainerDied","Data":"532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e"}
Feb 17 13:04:17.489223 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.489155 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46" event={"ID":"7cd33478-287f-4b77-9e7e-047be0c2fbba","Type":"ContainerDied","Data":"c1be09818af88dcc6787d9da60a57daf7bd68a163e7545b4ee68773928d6b5c4"}
Feb 17 13:04:17.489223 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.489180 2573 scope.go:117] "RemoveContainer" containerID="7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628"
Feb 17 13:04:17.491437 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.491414 2573 generic.go:358] "Generic (PLEG): container finished" podID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerID="87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019" exitCode=137
Feb 17 13:04:17.491437 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.491437 2573 generic.go:358] "Generic (PLEG): container finished" podID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerID="5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede" exitCode=137
Feb 17 13:04:17.491568 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.491498 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"
Feb 17 13:04:17.491568 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.491498 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" event={"ID":"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e","Type":"ContainerDied","Data":"87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019"}
Feb 17 13:04:17.491568 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.491532 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" event={"ID":"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e","Type":"ContainerDied","Data":"5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede"}
Feb 17 13:04:17.491568 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.491544 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg" event={"ID":"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e","Type":"ContainerDied","Data":"310003a21069de67e3cffeb95314eaec359931ecf02d04ebcf666204e9721d17"}
Feb 17 13:04:17.499147 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.499040 2573 scope.go:117] "RemoveContainer" containerID="532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e"
Feb 17 13:04:17.508774 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.508754 2573 scope.go:117] "RemoveContainer" containerID="455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3"
Feb 17 13:04:17.513333 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.513313 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"]
Feb 17 13:04:17.515726 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.515697 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"]
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-s4j46"] Feb 17 13:04:17.517556 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.517540 2573 scope.go:117] "RemoveContainer" containerID="e502b0b1e3e92c5944573bac3fa8b22c3f82b75db94bacaf8de6655552ffef8c" Feb 17 13:04:17.528815 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.528800 2573 scope.go:117] "RemoveContainer" containerID="7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628" Feb 17 13:04:17.529052 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:17.529035 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628\": container with ID starting with 7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628 not found: ID does not exist" containerID="7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628" Feb 17 13:04:17.529139 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.529061 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628"} err="failed to get container status \"7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628\": rpc error: code = NotFound desc = could not find container \"7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628\": container with ID starting with 7c2d520f5a2f7eeb97d3431a9b7b7d01d3b7cfc4683b2c2e03753bff4762e628 not found: ID does not exist" Feb 17 13:04:17.529139 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.529087 2573 scope.go:117] "RemoveContainer" containerID="532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e" Feb 17 13:04:17.529460 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:17.529442 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e\": container with ID starting with 532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e not found: ID does not exist" containerID="532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e" Feb 17 13:04:17.529504 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.529464 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e"} err="failed to get container status \"532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e\": rpc error: code = NotFound desc = could not find container \"532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e\": container with ID starting with 532a11209547f24dc82004fffe1fcd2706c22a222b48b52aa8510cefe990d48e not found: ID does not exist" Feb 17 13:04:17.529504 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.529480 2573 scope.go:117] "RemoveContainer" containerID="455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3" Feb 17 13:04:17.529684 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:17.529668 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3\": container with ID starting with 455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3 not found: ID does not exist" 
containerID="455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3" Feb 17 13:04:17.529728 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.529687 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3"} err="failed to get container status \"455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3\": rpc error: code = NotFound desc = could not find container \"455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3\": container with ID starting with 455124791d07f7c9aa5e020e096064a23f8965f97b03b9d0ac1f5a6e88fa0ec3 not found: ID does not exist" Feb 17 13:04:17.529728 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.529701 2573 scope.go:117] "RemoveContainer" containerID="e502b0b1e3e92c5944573bac3fa8b22c3f82b75db94bacaf8de6655552ffef8c" Feb 17 13:04:17.529903 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:17.529886 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e502b0b1e3e92c5944573bac3fa8b22c3f82b75db94bacaf8de6655552ffef8c\": container with ID starting with e502b0b1e3e92c5944573bac3fa8b22c3f82b75db94bacaf8de6655552ffef8c not found: ID does not exist" containerID="e502b0b1e3e92c5944573bac3fa8b22c3f82b75db94bacaf8de6655552ffef8c" Feb 17 13:04:17.529944 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.529907 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e502b0b1e3e92c5944573bac3fa8b22c3f82b75db94bacaf8de6655552ffef8c"} err="failed to get container status \"e502b0b1e3e92c5944573bac3fa8b22c3f82b75db94bacaf8de6655552ffef8c\": rpc error: code = NotFound desc = could not find container \"e502b0b1e3e92c5944573bac3fa8b22c3f82b75db94bacaf8de6655552ffef8c\": container with ID starting with e502b0b1e3e92c5944573bac3fa8b22c3f82b75db94bacaf8de6655552ffef8c not found: ID does not exist" Feb 17 13:04:17.529944 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.529919 2573 scope.go:117] "RemoveContainer" containerID="54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336" Feb 17 13:04:17.537406 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.537391 2573 scope.go:117] "RemoveContainer" containerID="87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019" Feb 17 13:04:17.544257 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.544240 2573 scope.go:117] "RemoveContainer" containerID="5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22" Feb 17 13:04:17.551310 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.551289 2573 scope.go:117] "RemoveContainer" containerID="5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede" Feb 17 13:04:17.558581 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.558562 2573 scope.go:117] "RemoveContainer" containerID="32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e" Feb 17 13:04:17.569710 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.569681 2573 scope.go:117] "RemoveContainer" containerID="54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336" Feb 17 13:04:17.569985 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:17.569967 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336\": container with ID starting with 54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336 not 
found: ID does not exist" containerID="54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336" Feb 17 13:04:17.570032 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.569993 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336"} err="failed to get container status \"54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336\": rpc error: code = NotFound desc = could not find container \"54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336\": container with ID starting with 54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336 not found: ID does not exist" Feb 17 13:04:17.570032 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.570013 2573 scope.go:117] "RemoveContainer" containerID="87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019" Feb 17 13:04:17.570283 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:17.570266 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019\": container with ID starting with 87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019 not found: ID does not exist" containerID="87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019" Feb 17 13:04:17.570327 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.570299 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019"} err="failed to get container status \"87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019\": rpc error: code = NotFound desc = could not find container \"87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019\": container with ID starting with 87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019 not found: ID does not exist" Feb 17 13:04:17.570327 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.570317 2573 scope.go:117] "RemoveContainer" containerID="5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22" Feb 17 13:04:17.570523 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:17.570509 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22\": container with ID starting with 5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22 not found: ID does not exist" containerID="5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22" Feb 17 13:04:17.570559 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.570536 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22"} err="failed to get container status \"5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22\": rpc error: code = NotFound desc = could not find container \"5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22\": container with ID starting with 5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22 not found: ID does not exist" Feb 17 13:04:17.570559 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.570549 2573 scope.go:117] "RemoveContainer" containerID="5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede" Feb 17 13:04:17.570783 ip-10-0-131-216 
kubenswrapper[2573]: E0217 13:04:17.570756 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede\": container with ID starting with 5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede not found: ID does not exist" containerID="5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede" Feb 17 13:04:17.570863 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.570781 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede"} err="failed to get container status \"5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede\": rpc error: code = NotFound desc = could not find container \"5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede\": container with ID starting with 5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede not found: ID does not exist" Feb 17 13:04:17.570863 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.570796 2573 scope.go:117] "RemoveContainer" containerID="32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e" Feb 17 13:04:17.571018 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:17.571002 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e\": container with ID starting with 32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e not found: ID does not exist" containerID="32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e" Feb 17 13:04:17.571059 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.571021 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e"} err="failed to get container status \"32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e\": rpc error: code = NotFound desc = could not find container \"32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e\": container with ID starting with 32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e not found: ID does not exist" Feb 17 13:04:17.571059 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.571032 2573 scope.go:117] "RemoveContainer" containerID="54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336" Feb 17 13:04:17.571312 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.571290 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336"} err="failed to get container status \"54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336\": rpc error: code = NotFound desc = could not find container \"54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336\": container with ID starting with 54d997827f7bfea1d1ef3cea2a7d306630c1d698706c91a995e811dd61d2f336 not found: ID does not exist" Feb 17 13:04:17.571390 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.571312 2573 scope.go:117] "RemoveContainer" containerID="87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019" Feb 17 13:04:17.571507 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.571488 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019"} err="failed to get container status \"87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019\": rpc error: code = NotFound desc = could not find container \"87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019\": container with ID starting with 87d390c8461a1d1565fd8b24188cdd05e5dd4b201b6f8c2fa5d71ed062ffb019 not found: ID does not exist" Feb 17 13:04:17.571545 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.571508 2573 scope.go:117] "RemoveContainer" containerID="5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22" Feb 17 13:04:17.571704 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.571687 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22"} err="failed to get container status \"5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22\": rpc error: code = NotFound desc = could not find container \"5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22\": container with ID starting with 5a994cd2c475e75e507de8025cf3112ae1210faa20f0ddacaaa2883c0cdecd22 not found: ID does not exist" Feb 17 13:04:17.571744 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.571705 2573 scope.go:117] "RemoveContainer" containerID="5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede" Feb 17 13:04:17.571907 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.571885 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede"} err="failed to get container status \"5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede\": rpc error: code = NotFound desc = could not find container \"5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede\": container with ID starting with 5bc72be60c2d7e673df11c85cceaa58cfd77e3900d4316a974bb6d52df6e1ede not found: ID does not exist" Feb 17 13:04:17.571986 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.571910 2573 scope.go:117] "RemoveContainer" containerID="32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e" Feb 17 13:04:17.572183 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.572166 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e"} err="failed to get container status \"32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e\": rpc error: code = NotFound desc = could not find container \"32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e\": container with ID starting with 32bebd0922cb06e65ee31a129f657ccd474900dc23b22ecfe716e1807d880e8e not found: ID does not exist" Feb 17 13:04:17.584474 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.584456 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-offline-tls\") pod \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " Feb 17 13:04:17.584575 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.584505 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-online-tls\") pod 
\"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " Feb 17 13:04:17.584575 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.584543 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp4m2\" (UniqueName: \"kubernetes.io/projected/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-kube-api-access-rp4m2\") pod \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " Feb 17 13:04:17.584691 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.584578 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-ui-tls\") pod \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " Feb 17 13:04:17.584691 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.584599 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-registry-tls\") pod \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " Feb 17 13:04:17.584691 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.584626 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-feast-data\") pod \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\" (UID: \"5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e\") " Feb 17 13:04:17.584943 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.584885 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vdk8g\" (UniqueName: \"kubernetes.io/projected/7cd33478-287f-4b77-9e7e-047be0c2fbba-kube-api-access-vdk8g\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:17.584943 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.584907 2573 reconciler_common.go:299] "Volume detached for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/7cd33478-287f-4b77-9e7e-047be0c2fbba-feast-data\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:17.584943 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.584922 2573 reconciler_common.go:299] "Volume detached for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-online-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:17.584943 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.584935 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/7cd33478-287f-4b77-9e7e-047be0c2fbba-registry-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:17.584943 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.584947 2573 reconciler_common.go:299] "Volume detached for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-ui-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:17.585258 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.584959 2573 reconciler_common.go:299] "Volume detached for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/7cd33478-287f-4b77-9e7e-047be0c2fbba-offline-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:17.585359 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.585335 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-feast-data" (OuterVolumeSpecName: "feast-data") pod "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" (UID: "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e"). InnerVolumeSpecName "feast-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Feb 17 13:04:17.586628 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.586608 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-online-tls" (OuterVolumeSpecName: "online-tls") pod "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" (UID: "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e"). InnerVolumeSpecName "online-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:04:17.587092 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.587072 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-kube-api-access-rp4m2" (OuterVolumeSpecName: "kube-api-access-rp4m2") pod "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" (UID: "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e"). InnerVolumeSpecName "kube-api-access-rp4m2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 13:04:17.587092 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.587079 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" (UID: "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:04:17.587208 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.587094 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-offline-tls" (OuterVolumeSpecName: "offline-tls") pod "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" (UID: "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e"). InnerVolumeSpecName "offline-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:04:17.587208 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.587124 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-ui-tls" (OuterVolumeSpecName: "ui-tls") pod "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" (UID: "5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e"). InnerVolumeSpecName "ui-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:04:17.686414 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.686373 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rp4m2\" (UniqueName: \"kubernetes.io/projected/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-kube-api-access-rp4m2\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:17.686414 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.686405 2573 reconciler_common.go:299] "Volume detached for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-ui-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:17.686414 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.686417 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-registry-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:17.686414 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.686426 2573 reconciler_common.go:299] "Volume detached for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-feast-data\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:17.686681 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.686435 2573 reconciler_common.go:299] "Volume detached for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-offline-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:17.686681 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.686443 2573 reconciler_common.go:299] "Volume detached for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e-online-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:17.813587 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.813558 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"] Feb 17 13:04:17.816008 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:17.815986 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-tp5bg"] Feb 17 13:04:18.933740 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:18.933706 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" path="/var/lib/kubelet/pods/5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e/volumes" Feb 17 13:04:18.934296 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:18.934281 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" path="/var/lib/kubelet/pods/7cd33478-287f-4b77-9e7e-047be0c2fbba/volumes" Feb 17 13:04:28.604446 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:28.604408 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6"] Feb 17 13:04:28.604838 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:28.604669 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" podUID="0dd8a7cf-bc8a-4865-b648-f720263400c5" containerName="manager" containerID="cri-o://3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12" gracePeriod=10 Feb 17 13:04:28.844700 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:28.844677 2573 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" Feb 17 13:04:28.879991 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:28.879910 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vkpb\" (UniqueName: \"kubernetes.io/projected/0dd8a7cf-bc8a-4865-b648-f720263400c5-kube-api-access-4vkpb\") pod \"0dd8a7cf-bc8a-4865-b648-f720263400c5\" (UID: \"0dd8a7cf-bc8a-4865-b648-f720263400c5\") " Feb 17 13:04:28.882066 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:28.882041 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd8a7cf-bc8a-4865-b648-f720263400c5-kube-api-access-4vkpb" (OuterVolumeSpecName: "kube-api-access-4vkpb") pod "0dd8a7cf-bc8a-4865-b648-f720263400c5" (UID: "0dd8a7cf-bc8a-4865-b648-f720263400c5"). InnerVolumeSpecName "kube-api-access-4vkpb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 13:04:28.980589 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:28.980548 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4vkpb\" (UniqueName: \"kubernetes.io/projected/0dd8a7cf-bc8a-4865-b648-f720263400c5-kube-api-access-4vkpb\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:04:29.527414 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:29.527380 2573 generic.go:358] "Generic (PLEG): container finished" podID="0dd8a7cf-bc8a-4865-b648-f720263400c5" containerID="3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12" exitCode=0 Feb 17 13:04:29.527591 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:29.527441 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" Feb 17 13:04:29.527591 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:29.527461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" event={"ID":"0dd8a7cf-bc8a-4865-b648-f720263400c5","Type":"ContainerDied","Data":"3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12"} Feb 17 13:04:29.527591 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:29.527501 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6" event={"ID":"0dd8a7cf-bc8a-4865-b648-f720263400c5","Type":"ContainerDied","Data":"3a58093024be584389abf5a20f005278585690df36c448265d698270d8826ff1"} Feb 17 13:04:29.527591 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:29.527520 2573 scope.go:117] "RemoveContainer" containerID="3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12" Feb 17 13:04:29.535643 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:29.535626 2573 scope.go:117] "RemoveContainer" containerID="3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12" Feb 17 13:04:29.535889 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:29.535872 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12\": container with ID starting with 3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12 not found: ID does not exist" containerID="3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12" Feb 17 13:04:29.535940 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:29.535897 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12"} err="failed to get container status \"3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12\": rpc error: code = NotFound desc = could not find container \"3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12\": container with ID starting with 3476fb8ff2ffc1c30617c743d70c2800e3f659e24b0e3be547b89ba350e80c12 not found: ID does not exist" Feb 17 13:04:29.549512 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:29.549468 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6"] Feb 17 13:04:29.550888 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:29.550864 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["feast-operator-system/feast-operator-controller-manager-6984f6c56-n68c6"] Feb 17 13:04:30.932836 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:30.932807 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd8a7cf-bc8a-4865-b648-f720263400c5" path="/var/lib/kubelet/pods/0dd8a7cf-bc8a-4865-b648-f720263400c5/volumes" Feb 17 13:04:32.574953 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.574900 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz"] Feb 17 13:04:32.575429 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575393 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="ui" Feb 17 13:04:32.575429 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575414 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="ui" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575430 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="ui" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575439 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="ui" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575453 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="registry" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575463 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="registry" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575482 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="online" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575491 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="online" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575503 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="offline" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575511 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="offline" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575524 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="online" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575532 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="online" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575542 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="feast-init" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575550 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="feast-init" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575561 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0dd8a7cf-bc8a-4865-b648-f720263400c5" containerName="manager" Feb 17 13:04:32.575570 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575569 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd8a7cf-bc8a-4865-b648-f720263400c5" containerName="manager" Feb 17 13:04:32.576239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575580 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="offline" Feb 17 13:04:32.576239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575588 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="offline" Feb 17 13:04:32.576239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575604 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="feast-init" Feb 17 13:04:32.576239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575632 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="feast-init" Feb 17 13:04:32.576239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575710 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="online" Feb 17 13:04:32.576239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575725 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="offline" Feb 17 13:04:32.576239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575735 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0dd8a7cf-bc8a-4865-b648-f720263400c5" containerName="manager" Feb 17 13:04:32.576239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575746 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="registry" Feb 17 13:04:32.576239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575756 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f341b4d-1c1e-4ae7-a80c-8f66a5eb896e" containerName="ui" Feb 17 13:04:32.576239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575767 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="online" Feb 17 13:04:32.576239 
ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575776 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="offline" Feb 17 13:04:32.576239 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.575787 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7cd33478-287f-4b77-9e7e-047be0c2fbba" containerName="ui" Feb 17 13:04:32.580487 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.580466 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz" Feb 17 13:04:32.583543 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.583311 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"feast-operator-system\"/\"kube-root-ca.crt\"" Feb 17 13:04:32.584229 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.584207 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"feast-operator-system\"/\"openshift-service-ca.crt\"" Feb 17 13:04:32.584323 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.584288 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"feast-operator-system\"/\"feast-operator-controller-manager-dockercfg-qqnqs\"" Feb 17 13:04:32.585247 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.585227 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz"] Feb 17 13:04:32.608712 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.608685 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp7pc\" (UniqueName: \"kubernetes.io/projected/a5dca989-955c-441a-8245-0e3cc5b84aea-kube-api-access-hp7pc\") pod \"feast-operator-controller-manager-6984f6c56-tcxgz\" (UID: \"a5dca989-955c-441a-8245-0e3cc5b84aea\") " pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz" Feb 17 13:04:32.709705 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.709668 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hp7pc\" (UniqueName: \"kubernetes.io/projected/a5dca989-955c-441a-8245-0e3cc5b84aea-kube-api-access-hp7pc\") pod \"feast-operator-controller-manager-6984f6c56-tcxgz\" (UID: \"a5dca989-955c-441a-8245-0e3cc5b84aea\") " pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz" Feb 17 13:04:32.718218 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.718188 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp7pc\" (UniqueName: \"kubernetes.io/projected/a5dca989-955c-441a-8245-0e3cc5b84aea-kube-api-access-hp7pc\") pod \"feast-operator-controller-manager-6984f6c56-tcxgz\" (UID: \"a5dca989-955c-441a-8245-0e3cc5b84aea\") " pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz" Feb 17 13:04:32.891989 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:32.891882 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz" Feb 17 13:04:33.014091 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:33.014057 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz"] Feb 17 13:04:33.016286 ip-10-0-131-216 kubenswrapper[2573]: W0217 13:04:33.016257 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5dca989_955c_441a_8245_0e3cc5b84aea.slice/crio-b358906bc0a3c8619572b8fe22f5faf3f7ce32dab4cbe5105ed391b09c9fc4f5 WatchSource:0}: Error finding container b358906bc0a3c8619572b8fe22f5faf3f7ce32dab4cbe5105ed391b09c9fc4f5: Status 404 returned error can't find the container with id b358906bc0a3c8619572b8fe22f5faf3f7ce32dab4cbe5105ed391b09c9fc4f5 Feb 17 13:04:33.017964 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:33.017946 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 13:04:33.541882 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:33.541843 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz" event={"ID":"a5dca989-955c-441a-8245-0e3cc5b84aea","Type":"ContainerStarted","Data":"aed9801a0e1a655e5186badf99b265eb9a6ab891ca70e81413517193a312859c"} Feb 17 13:04:33.541882 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:33.541885 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz" event={"ID":"a5dca989-955c-441a-8245-0e3cc5b84aea","Type":"ContainerStarted","Data":"b358906bc0a3c8619572b8fe22f5faf3f7ce32dab4cbe5105ed391b09c9fc4f5"} Feb 17 13:04:33.542085 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:33.541988 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz" Feb 17 13:04:33.557307 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:33.557257 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz" podStartSLOduration=1.5572432379999999 podStartE2EDuration="1.557243238s" podCreationTimestamp="2026-02-17 13:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:04:33.55685571 +0000 UTC m=+1097.151670536" watchObservedRunningTime="2026-02-17 13:04:33.557243238 +0000 UTC m=+1097.152058062" Feb 17 13:04:44.547552 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:44.547524 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="feast-operator-system/feast-operator-controller-manager-6984f6c56-tcxgz" Feb 17 13:04:50.796221 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.796184 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw"] Feb 17 13:04:50.800260 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.800241 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.803314 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.802911 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-ui-tls\"" Feb 17 13:04:50.803314 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.802929 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-dockercfg-v2qss\"" Feb 17 13:04:50.803314 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.803010 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-registry-tls\"" Feb 17 13:04:50.803314 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.803185 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-feast\"/\"openshift-service-ca.crt\"" Feb 17 13:04:50.803314 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.803209 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-offline-tls\"" Feb 17 13:04:50.803624 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.803443 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-online-tls\"" Feb 17 13:04:50.803838 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.803817 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-feast\"/\"kube-root-ca.crt\"" Feb 17 13:04:50.811378 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.811358 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw"] Feb 17 13:04:50.845768 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.845743 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.845925 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.845791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-registry-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.845925 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.845816 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/032d25f2-df66-47d3-a267-b13f2bacdbfa-feast-data\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.845925 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.845886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-online-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " 
pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.846099 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.845927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.846099 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.845944 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcd54\" (UniqueName: \"kubernetes.io/projected/032d25f2-df66-47d3-a267-b13f2bacdbfa-kube-api-access-rcd54\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.946567 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.946538 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-online-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.946737 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.946588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.946737 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.946609 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcd54\" (UniqueName: \"kubernetes.io/projected/032d25f2-df66-47d3-a267-b13f2bacdbfa-kube-api-access-rcd54\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.946737 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.946632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.946737 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.946669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-registry-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.946737 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.946686 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/032d25f2-df66-47d3-a267-b13f2bacdbfa-feast-data\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " 
pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.946737 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:50.946694 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-online-tls: secret "feast-simple-feast-setup-online-tls" not found Feb 17 13:04:50.946975 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:50.946746 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-offline-tls: secret "feast-simple-feast-setup-offline-tls" not found Feb 17 13:04:50.946975 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:50.946775 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-online-tls podName:032d25f2-df66-47d3-a267-b13f2bacdbfa nodeName:}" failed. No retries permitted until 2026-02-17 13:04:51.446752346 +0000 UTC m=+1115.041567154 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "online-tls" (UniqueName: "kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-online-tls") pod "feast-simple-feast-setup-565c46b746-5fbhw" (UID: "032d25f2-df66-47d3-a267-b13f2bacdbfa") : secret "feast-simple-feast-setup-online-tls" not found Feb 17 13:04:50.946975 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:50.946801 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-offline-tls podName:032d25f2-df66-47d3-a267-b13f2bacdbfa nodeName:}" failed. No retries permitted until 2026-02-17 13:04:51.446783354 +0000 UTC m=+1115.041598163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "offline-tls" (UniqueName: "kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-offline-tls") pod "feast-simple-feast-setup-565c46b746-5fbhw" (UID: "032d25f2-df66-47d3-a267-b13f2bacdbfa") : secret "feast-simple-feast-setup-offline-tls" not found Feb 17 13:04:50.946975 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:50.946857 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-ui-tls: secret "feast-simple-feast-setup-ui-tls" not found Feb 17 13:04:50.946975 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:04:50.946887 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-ui-tls podName:032d25f2-df66-47d3-a267-b13f2bacdbfa nodeName:}" failed. No retries permitted until 2026-02-17 13:04:51.446877302 +0000 UTC m=+1115.041692112 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ui-tls" (UniqueName: "kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-ui-tls") pod "feast-simple-feast-setup-565c46b746-5fbhw" (UID: "032d25f2-df66-47d3-a267-b13f2bacdbfa") : secret "feast-simple-feast-setup-ui-tls" not found Feb 17 13:04:50.947178 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.946976 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/032d25f2-df66-47d3-a267-b13f2bacdbfa-feast-data\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.949543 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.949522 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-registry-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:50.956516 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:50.956494 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcd54\" (UniqueName: \"kubernetes.io/projected/032d25f2-df66-47d3-a267-b13f2bacdbfa-kube-api-access-rcd54\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:51.452229 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:51.452188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:51.452444 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:51.452241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:51.452444 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:51.452333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-online-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:51.454841 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:51.454815 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:51.454992 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:51.454971 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-offline-tls\") pod 
\"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:51.455036 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:51.454971 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-online-tls\") pod \"feast-simple-feast-setup-565c46b746-5fbhw\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:51.720732 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:51.720645 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:51.850352 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:51.850316 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw"] Feb 17 13:04:51.851773 ip-10-0-131-216 kubenswrapper[2573]: W0217 13:04:51.851744 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod032d25f2_df66_47d3_a267_b13f2bacdbfa.slice/crio-16c86120d5a2056e2ba005f77d7e3050896ee18482b5bfbd1f06b283223126aa WatchSource:0}: Error finding container 16c86120d5a2056e2ba005f77d7e3050896ee18482b5bfbd1f06b283223126aa: Status 404 returned error can't find the container with id 16c86120d5a2056e2ba005f77d7e3050896ee18482b5bfbd1f06b283223126aa Feb 17 13:04:52.608809 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:52.608769 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" event={"ID":"032d25f2-df66-47d3-a267-b13f2bacdbfa","Type":"ContainerStarted","Data":"0396904dcacadb24ffb9d14cbc120b2a39f519e5d62d85b98be2c4ad1b116572"} Feb 17 13:04:52.608985 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:52.608818 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" event={"ID":"032d25f2-df66-47d3-a267-b13f2bacdbfa","Type":"ContainerStarted","Data":"16c86120d5a2056e2ba005f77d7e3050896ee18482b5bfbd1f06b283223126aa"} Feb 17 13:04:55.620435 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:55.620403 2573 generic.go:358] "Generic (PLEG): container finished" podID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerID="0396904dcacadb24ffb9d14cbc120b2a39f519e5d62d85b98be2c4ad1b116572" exitCode=0 Feb 17 13:04:55.620920 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:55.620476 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" event={"ID":"032d25f2-df66-47d3-a267-b13f2bacdbfa","Type":"ContainerDied","Data":"0396904dcacadb24ffb9d14cbc120b2a39f519e5d62d85b98be2c4ad1b116572"} Feb 17 13:04:56.629726 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:56.629686 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" event={"ID":"032d25f2-df66-47d3-a267-b13f2bacdbfa","Type":"ContainerStarted","Data":"295cd8b2fe0c0f3932ed7033454baf14da7c7570dda200c6b0bf3f4070296c11"} Feb 17 13:04:56.630284 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:56.629739 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" 
event={"ID":"032d25f2-df66-47d3-a267-b13f2bacdbfa","Type":"ContainerStarted","Data":"0575d628857107a61af84b365e392a49ae4dde4020cbc248f4854de8a30c38db"} Feb 17 13:04:56.630284 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:56.629758 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" event={"ID":"032d25f2-df66-47d3-a267-b13f2bacdbfa","Type":"ContainerStarted","Data":"495823bdf6cee46e238bb47630fdc86a92082a52d3cea3a0245a947dc451eac3"} Feb 17 13:04:56.630284 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:56.629772 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" event={"ID":"032d25f2-df66-47d3-a267-b13f2bacdbfa","Type":"ContainerStarted","Data":"5efcb5afef510dcbbe5c161429e3702fb989ede245d7a613e801848e4bb6d870"} Feb 17 13:04:56.651662 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:56.651612 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podStartSLOduration=6.651592963 podStartE2EDuration="6.651592963s" podCreationTimestamp="2026-02-17 13:04:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:04:56.649890921 +0000 UTC m=+1120.244705750" watchObservedRunningTime="2026-02-17 13:04:56.651592963 +0000 UTC m=+1120.246407789" Feb 17 13:04:57.720767 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:57.720725 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:57.720767 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:57.720777 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:57.721352 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:57.720791 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:57.721352 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:57.721195 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:04:57.722774 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:57.722739 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.38:8443: connect: connection refused" Feb 17 13:04:57.722892 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:57.722768 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="registry" probeResult="failure" output="dial tcp 10.133.0.38:6571: connect: connection refused" Feb 17 13:04:57.722892 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:57.722800 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="online" probeResult="failure" output="Get \"https://10.133.0.38:6567/health\": dial tcp 10.133.0.38:6567: connect: connection refused" Feb 17 13:04:57.723096 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:04:57.723061 2573 prober.go:120] "Probe 
failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="offline" probeResult="failure" output="dial tcp 10.133.0.38:8816: connect: connection refused" Feb 17 13:05:00.722389 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:05:00.722352 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:05:00.723186 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:05:00.723161 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:05:00.723323 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:05:00.723312 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:05:00.723675 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:05:00.723647 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:05:00.723796 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:05:00.723694 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:05:00.723796 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:05:00.723708 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:05:00.723895 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:05:00.723828 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:05:00.723941 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:05:00.723927 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:05:00.724031 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:05:00.724017 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:05:00.727416 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:05:00.727401 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:05:01.647255 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:05:01.647222 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:05:01.650231 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:05:01.650209 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:06:02.548955 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.548912 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw"] Feb 17 13:06:02.549450 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.549335 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="registry" containerID="cri-o://5efcb5afef510dcbbe5c161429e3702fb989ede245d7a613e801848e4bb6d870" gracePeriod=30 Feb 17 13:06:02.549528 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.549413 
2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="offline" containerID="cri-o://0575d628857107a61af84b365e392a49ae4dde4020cbc248f4854de8a30c38db" gracePeriod=30 Feb 17 13:06:02.549528 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.549496 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="ui" containerID="cri-o://295cd8b2fe0c0f3932ed7033454baf14da7c7570dda200c6b0bf3f4070296c11" gracePeriod=30 Feb 17 13:06:02.549692 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.549421 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="online" containerID="cri-o://495823bdf6cee46e238bb47630fdc86a92082a52d3cea3a0245a947dc451eac3" gracePeriod=30 Feb 17 13:06:02.788616 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.788565 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq"] Feb 17 13:06:02.793592 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.793569 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.797840 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.797809 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-feast\"/\"feast-simple-feast-setup-dockercfg-wpk5r\"" Feb 17 13:06:02.801206 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.801140 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq"] Feb 17 13:06:02.856774 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.856726 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/e7529458-25c0-441b-a096-2d635eb40468-feast-data\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.856953 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.856785 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-online-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.856953 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.856821 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-registry-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.857081 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.856953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl587\" (UniqueName: 
\"kubernetes.io/projected/e7529458-25c0-441b-a096-2d635eb40468-kube-api-access-xl587\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.857081 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.857038 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.857081 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.857073 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.958690 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.958648 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-registry-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.958944 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.958742 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl587\" (UniqueName: \"kubernetes.io/projected/e7529458-25c0-441b-a096-2d635eb40468-kube-api-access-xl587\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.958944 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.958796 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.958944 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.958824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.958944 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.958853 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/e7529458-25c0-441b-a096-2d635eb40468-feast-data\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.958944 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.958891 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: 
\"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-online-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.959307 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:06:02.959018 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-online-tls: secret "feast-simple-feast-setup-online-tls" not found Feb 17 13:06:02.959307 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:06:02.959084 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-online-tls podName:e7529458-25c0-441b-a096-2d635eb40468 nodeName:}" failed. No retries permitted until 2026-02-17 13:06:03.459062135 +0000 UTC m=+1187.053876947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "online-tls" (UniqueName: "kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-online-tls") pod "feast-simple-feast-setup-565c46b746-czfbq" (UID: "e7529458-25c0-441b-a096-2d635eb40468") : secret "feast-simple-feast-setup-online-tls" not found Feb 17 13:06:02.959437 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:06:02.959361 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-offline-tls: secret "feast-simple-feast-setup-offline-tls" not found Feb 17 13:06:02.959437 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:06:02.959402 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-offline-tls podName:e7529458-25c0-441b-a096-2d635eb40468 nodeName:}" failed. No retries permitted until 2026-02-17 13:06:03.459389379 +0000 UTC m=+1187.054204189 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "offline-tls" (UniqueName: "kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-offline-tls") pod "feast-simple-feast-setup-565c46b746-czfbq" (UID: "e7529458-25c0-441b-a096-2d635eb40468") : secret "feast-simple-feast-setup-offline-tls" not found Feb 17 13:06:02.959804 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.959753 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/e7529458-25c0-441b-a096-2d635eb40468-feast-data\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.964605 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.964572 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-ui-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.964876 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.964846 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-registry-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:02.968501 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:02.968468 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl587\" (UniqueName: 
\"kubernetes.io/projected/e7529458-25c0-441b-a096-2d635eb40468-kube-api-access-xl587\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:03.463840 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:03.463799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:03.464053 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:03.463862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-online-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:03.464053 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:06:03.463963 2573 secret.go:189] Couldn't get secret test-ns-feast/feast-simple-feast-setup-offline-tls: secret "feast-simple-feast-setup-offline-tls" not found Feb 17 13:06:03.464208 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:06:03.464059 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-offline-tls podName:e7529458-25c0-441b-a096-2d635eb40468 nodeName:}" failed. No retries permitted until 2026-02-17 13:06:04.464034212 +0000 UTC m=+1188.058849022 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "offline-tls" (UniqueName: "kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-offline-tls") pod "feast-simple-feast-setup-565c46b746-czfbq" (UID: "e7529458-25c0-441b-a096-2d635eb40468") : secret "feast-simple-feast-setup-offline-tls" not found Feb 17 13:06:03.466284 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:03.466258 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-online-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:03.838826 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:03.838739 2573 generic.go:358] "Generic (PLEG): container finished" podID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerID="295cd8b2fe0c0f3932ed7033454baf14da7c7570dda200c6b0bf3f4070296c11" exitCode=0 Feb 17 13:06:03.838826 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:03.838764 2573 generic.go:358] "Generic (PLEG): container finished" podID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerID="495823bdf6cee46e238bb47630fdc86a92082a52d3cea3a0245a947dc451eac3" exitCode=0 Feb 17 13:06:03.839248 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:03.838815 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" event={"ID":"032d25f2-df66-47d3-a267-b13f2bacdbfa","Type":"ContainerDied","Data":"295cd8b2fe0c0f3932ed7033454baf14da7c7570dda200c6b0bf3f4070296c11"} Feb 17 13:06:03.839248 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:03.838853 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" 
event={"ID":"032d25f2-df66-47d3-a267-b13f2bacdbfa","Type":"ContainerDied","Data":"495823bdf6cee46e238bb47630fdc86a92082a52d3cea3a0245a947dc451eac3"} Feb 17 13:06:04.473955 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:04.473918 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:04.476311 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:04.476288 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-offline-tls\") pod \"feast-simple-feast-setup-565c46b746-czfbq\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:04.610971 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:04.610918 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:04.735384 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:04.735355 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq"] Feb 17 13:06:04.738307 ip-10-0-131-216 kubenswrapper[2573]: W0217 13:06:04.738283 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7529458_25c0_441b_a096_2d635eb40468.slice/crio-0e79b435d4965fa63c828e5bdb4d4e1245ec17c98415296c267776d0002775c4 WatchSource:0}: Error finding container 0e79b435d4965fa63c828e5bdb4d4e1245ec17c98415296c267776d0002775c4: Status 404 returned error can't find the container with id 0e79b435d4965fa63c828e5bdb4d4e1245ec17c98415296c267776d0002775c4 Feb 17 13:06:04.843473 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:04.843431 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" event={"ID":"e7529458-25c0-441b-a096-2d635eb40468","Type":"ContainerStarted","Data":"c74cff4f86636e795147d255e8684e911a62a570dba0358e5c8aba56e611cc1c"} Feb 17 13:06:04.843844 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:04.843481 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" event={"ID":"e7529458-25c0-441b-a096-2d635eb40468","Type":"ContainerStarted","Data":"0e79b435d4965fa63c828e5bdb4d4e1245ec17c98415296c267776d0002775c4"} Feb 17 13:06:08.855712 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:08.855675 2573 generic.go:358] "Generic (PLEG): container finished" podID="e7529458-25c0-441b-a096-2d635eb40468" containerID="c74cff4f86636e795147d255e8684e911a62a570dba0358e5c8aba56e611cc1c" exitCode=0 Feb 17 13:06:08.856183 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:08.855746 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" event={"ID":"e7529458-25c0-441b-a096-2d635eb40468","Type":"ContainerDied","Data":"c74cff4f86636e795147d255e8684e911a62a570dba0358e5c8aba56e611cc1c"} Feb 17 13:06:09.863042 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:09.863000 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" 
event={"ID":"e7529458-25c0-441b-a096-2d635eb40468","Type":"ContainerStarted","Data":"a1e69d039c04df8cb5de21595639a4da6c9ac24c7ef862d5d77e8eb099179221"} Feb 17 13:06:09.863554 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:09.863053 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" event={"ID":"e7529458-25c0-441b-a096-2d635eb40468","Type":"ContainerStarted","Data":"d03768321ab116e17d359f8eb929c45cfb8b9e07983a6a41ddf9efd814c8b943"} Feb 17 13:06:09.863554 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:09.863066 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" event={"ID":"e7529458-25c0-441b-a096-2d635eb40468","Type":"ContainerStarted","Data":"c8a67fa19fc715902dda4b1773e68cb5465f0c96dbb40ea2888bc44115a56811"} Feb 17 13:06:09.863554 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:09.863080 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" event={"ID":"e7529458-25c0-441b-a096-2d635eb40468","Type":"ContainerStarted","Data":"e5ec9e7f4eb2c06971a123b73e29feb975ec06470fc3a5e60038564090be12ff"} Feb 17 13:06:09.885745 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:09.885682 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podStartSLOduration=7.885660837 podStartE2EDuration="7.885660837s" podCreationTimestamp="2026-02-17 13:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:06:09.883059097 +0000 UTC m=+1193.477873936" watchObservedRunningTime="2026-02-17 13:06:09.885660837 +0000 UTC m=+1193.480475663" Feb 17 13:06:10.611389 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:10.611356 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:10.611644 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:10.611630 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:10.611748 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:10.611738 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:10.611833 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:10.611824 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:10.613564 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:10.613507 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="offline" probeResult="failure" output="dial tcp 10.133.0.39:8816: connect: connection refused" Feb 17 13:06:10.613696 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:10.613608 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="online" probeResult="failure" output="Get \"https://10.133.0.39:6567/health\": dial tcp 10.133.0.39:6567: connect: connection refused" Feb 17 13:06:10.613696 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:10.613532 2573 prober.go:120] "Probe 
failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="registry" probeResult="failure" output="dial tcp 10.133.0.39:6571: connect: connection refused" Feb 17 13:06:10.613696 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:10.613533 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.39:8443: connect: connection refused" Feb 17 13:06:10.724446 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:10.724403 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.38:8443: connect: connection refused" Feb 17 13:06:11.648136 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:11.648064 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="online" probeResult="failure" output="Get \"https://10.133.0.38:6567/health\": dial tcp 10.133.0.38:6567: connect: connection refused" Feb 17 13:06:13.612753 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:13.612719 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:13.613224 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:13.612810 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:13.613224 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:13.613015 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:13.613393 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:13.613368 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:13.613461 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:13.613405 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:13.613461 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:13.613418 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:13.613763 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:13.613740 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:13.613864 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:13.613798 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:13.613965 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:13.613950 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:13.617301 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:13.617281 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 
13:06:13.876429 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:13.876344 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:13.879397 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:13.879373 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:06:16.950003 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:16.949967 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/2.log" Feb 17 13:06:16.950882 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:16.950858 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/2.log" Feb 17 13:06:16.956791 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:16.956772 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log" Feb 17 13:06:16.957514 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:16.957498 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log" Feb 17 13:06:20.723915 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:20.723864 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.38:8443: connect: connection refused" Feb 17 13:06:21.648260 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:21.648215 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="online" probeResult="failure" output="Get \"https://10.133.0.38:6567/health\": dial tcp 10.133.0.38:6567: connect: connection refused" Feb 17 13:06:30.724519 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:30.724467 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.38:8443: connect: connection refused" Feb 17 13:06:30.725030 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:30.724615 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:06:31.647873 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:31.647828 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="online" probeResult="failure" output="Get \"https://10.133.0.38:6567/health\": dial tcp 10.133.0.38:6567: connect: connection refused" Feb 17 13:06:31.648044 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:31.647951 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:06:32.936378 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:32.936344 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerID="0575d628857107a61af84b365e392a49ae4dde4020cbc248f4854de8a30c38db" exitCode=137 Feb 17 13:06:32.936378 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:32.936372 2573 generic.go:358] "Generic (PLEG): container finished" podID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerID="5efcb5afef510dcbbe5c161429e3702fb989ede245d7a613e801848e4bb6d870" exitCode=137 Feb 17 13:06:32.936783 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:32.936403 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" event={"ID":"032d25f2-df66-47d3-a267-b13f2bacdbfa","Type":"ContainerDied","Data":"0575d628857107a61af84b365e392a49ae4dde4020cbc248f4854de8a30c38db"} Feb 17 13:06:32.936783 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:32.936425 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" event={"ID":"032d25f2-df66-47d3-a267-b13f2bacdbfa","Type":"ContainerDied","Data":"5efcb5afef510dcbbe5c161429e3702fb989ede245d7a613e801848e4bb6d870"} Feb 17 13:06:33.192617 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.192561 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:06:33.244850 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.244819 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcd54\" (UniqueName: \"kubernetes.io/projected/032d25f2-df66-47d3-a267-b13f2bacdbfa-kube-api-access-rcd54\") pod \"032d25f2-df66-47d3-a267-b13f2bacdbfa\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " Feb 17 13:06:33.245022 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.244867 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-offline-tls\") pod \"032d25f2-df66-47d3-a267-b13f2bacdbfa\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " Feb 17 13:06:33.245022 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.244887 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-ui-tls\") pod \"032d25f2-df66-47d3-a267-b13f2bacdbfa\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " Feb 17 13:06:33.245022 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.245000 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-online-tls\") pod \"032d25f2-df66-47d3-a267-b13f2bacdbfa\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " Feb 17 13:06:33.245314 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.245055 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/032d25f2-df66-47d3-a267-b13f2bacdbfa-feast-data\") pod \"032d25f2-df66-47d3-a267-b13f2bacdbfa\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " Feb 17 13:06:33.245314 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.245079 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-registry-tls\") pod \"032d25f2-df66-47d3-a267-b13f2bacdbfa\" (UID: \"032d25f2-df66-47d3-a267-b13f2bacdbfa\") " Feb 17 
13:06:33.245709 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.245673 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/032d25f2-df66-47d3-a267-b13f2bacdbfa-feast-data" (OuterVolumeSpecName: "feast-data") pod "032d25f2-df66-47d3-a267-b13f2bacdbfa" (UID: "032d25f2-df66-47d3-a267-b13f2bacdbfa"). InnerVolumeSpecName "feast-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Feb 17 13:06:33.247319 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.247296 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-ui-tls" (OuterVolumeSpecName: "ui-tls") pod "032d25f2-df66-47d3-a267-b13f2bacdbfa" (UID: "032d25f2-df66-47d3-a267-b13f2bacdbfa"). InnerVolumeSpecName "ui-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:06:33.247423 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.247321 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/032d25f2-df66-47d3-a267-b13f2bacdbfa-kube-api-access-rcd54" (OuterVolumeSpecName: "kube-api-access-rcd54") pod "032d25f2-df66-47d3-a267-b13f2bacdbfa" (UID: "032d25f2-df66-47d3-a267-b13f2bacdbfa"). InnerVolumeSpecName "kube-api-access-rcd54". PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 13:06:33.247423 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.247314 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-online-tls" (OuterVolumeSpecName: "online-tls") pod "032d25f2-df66-47d3-a267-b13f2bacdbfa" (UID: "032d25f2-df66-47d3-a267-b13f2bacdbfa"). InnerVolumeSpecName "online-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:06:33.247423 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.247375 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-offline-tls" (OuterVolumeSpecName: "offline-tls") pod "032d25f2-df66-47d3-a267-b13f2bacdbfa" (UID: "032d25f2-df66-47d3-a267-b13f2bacdbfa"). InnerVolumeSpecName "offline-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:06:33.247537 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.247491 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "032d25f2-df66-47d3-a267-b13f2bacdbfa" (UID: "032d25f2-df66-47d3-a267-b13f2bacdbfa"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:06:33.346191 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.346156 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rcd54\" (UniqueName: \"kubernetes.io/projected/032d25f2-df66-47d3-a267-b13f2bacdbfa-kube-api-access-rcd54\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:06:33.346191 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.346186 2573 reconciler_common.go:299] "Volume detached for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-offline-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:06:33.346191 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.346196 2573 reconciler_common.go:299] "Volume detached for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-ui-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:06:33.346414 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.346205 2573 reconciler_common.go:299] "Volume detached for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-online-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:06:33.346414 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.346213 2573 reconciler_common.go:299] "Volume detached for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/032d25f2-df66-47d3-a267-b13f2bacdbfa-feast-data\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:06:33.346414 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.346221 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/032d25f2-df66-47d3-a267-b13f2bacdbfa-registry-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:06:33.941474 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.941447 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" Feb 17 13:06:33.941474 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.941458 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw" event={"ID":"032d25f2-df66-47d3-a267-b13f2bacdbfa","Type":"ContainerDied","Data":"16c86120d5a2056e2ba005f77d7e3050896ee18482b5bfbd1f06b283223126aa"} Feb 17 13:06:33.941939 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.941502 2573 scope.go:117] "RemoveContainer" containerID="295cd8b2fe0c0f3932ed7033454baf14da7c7570dda200c6b0bf3f4070296c11" Feb 17 13:06:33.950924 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.950909 2573 scope.go:117] "RemoveContainer" containerID="0575d628857107a61af84b365e392a49ae4dde4020cbc248f4854de8a30c38db" Feb 17 13:06:33.958747 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.958728 2573 scope.go:117] "RemoveContainer" containerID="495823bdf6cee46e238bb47630fdc86a92082a52d3cea3a0245a947dc451eac3" Feb 17 13:06:33.963047 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.963026 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw"] Feb 17 13:06:33.966683 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.966664 2573 scope.go:117] "RemoveContainer" containerID="5efcb5afef510dcbbe5c161429e3702fb989ede245d7a613e801848e4bb6d870" Feb 17 13:06:33.969734 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.969714 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-5fbhw"] Feb 17 13:06:33.974124 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:33.974090 2573 scope.go:117] "RemoveContainer" containerID="0396904dcacadb24ffb9d14cbc120b2a39f519e5d62d85b98be2c4ad1b116572" Feb 17 13:06:34.933757 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:06:34.933723 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" path="/var/lib/kubelet/pods/032d25f2-df66-47d3-a267-b13f2bacdbfa/volumes" Feb 17 13:07:16.059560 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.059455 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76"] Feb 17 13:07:16.060092 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.059971 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="ui" Feb 17 13:07:16.060092 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.059991 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="ui" Feb 17 13:07:16.060092 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.060010 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="offline" Feb 17 13:07:16.060092 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.060018 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="offline" Feb 17 13:07:16.060092 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.060035 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="registry" Feb 17 13:07:16.060092 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.060045 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="registry" Feb 17 13:07:16.060092 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.060056 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="online" Feb 17 13:07:16.060092 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.060064 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="online" Feb 17 13:07:16.060092 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.060092 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="feast-init" Feb 17 13:07:16.060593 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.060100 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="feast-init" Feb 17 13:07:16.060593 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.060219 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="online" Feb 17 13:07:16.060593 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.060235 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="offline" Feb 17 13:07:16.060593 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.060244 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="registry" Feb 17 13:07:16.060593 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.060255 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="032d25f2-df66-47d3-a267-b13f2bacdbfa" containerName="ui" Feb 17 13:07:16.063943 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.063921 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.066667 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.066642 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-remote-registry\"/\"openshift-service-ca.crt\"" Feb 17 13:07:16.066806 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.066672 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-online-tls\"" Feb 17 13:07:16.066943 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.066708 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-dockercfg-nkc2v\"" Feb 17 13:07:16.067011 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.066960 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-remote-registry\"/\"kube-root-ca.crt\"" Feb 17 13:07:16.067011 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.066719 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-offline-tls\"" Feb 17 13:07:16.067151 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.066742 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-ui-tls\"" Feb 17 13:07:16.067860 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.067839 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-client-ca\"" Feb 17 13:07:16.076692 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.076669 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76"] Feb 17 13:07:16.114134 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.114089 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-ui-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.114254 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.114218 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-offline-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.114306 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.114273 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/a7cad315-3a4c-4de8-8af4-dc988b205c25-feast-data\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.114344 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.114318 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"online-tls\" (UniqueName: 
\"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-online-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.114393 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.114352 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbz6j\" (UniqueName: \"kubernetes.io/projected/a7cad315-3a4c-4de8-8af4-dc988b205c25-kube-api-access-wbz6j\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.114393 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.114378 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/a7cad315-3a4c-4de8-8af4-dc988b205c25-registry-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.214987 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.214957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbz6j\" (UniqueName: \"kubernetes.io/projected/a7cad315-3a4c-4de8-8af4-dc988b205c25-kube-api-access-wbz6j\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.214987 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.214994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/a7cad315-3a4c-4de8-8af4-dc988b205c25-registry-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.215224 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.215056 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-ui-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.215224 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.215100 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-offline-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.215224 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.215140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/a7cad315-3a4c-4de8-8af4-dc988b205c25-feast-data\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.215224 ip-10-0-131-216 
kubenswrapper[2573]: I0217 13:07:16.215183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-online-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.215430 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:07:16.215260 2573 secret.go:189] Couldn't get secret test-ns-remote-registry/feast-simple-feast-remote-setup-offline-tls: secret "feast-simple-feast-remote-setup-offline-tls" not found Feb 17 13:07:16.215430 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:07:16.215263 2573 secret.go:189] Couldn't get secret test-ns-remote-registry/feast-simple-feast-remote-setup-online-tls: secret "feast-simple-feast-remote-setup-online-tls" not found Feb 17 13:07:16.215430 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:07:16.215361 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-offline-tls podName:a7cad315-3a4c-4de8-8af4-dc988b205c25 nodeName:}" failed. No retries permitted until 2026-02-17 13:07:16.715315404 +0000 UTC m=+1260.310130226 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "offline-tls" (UniqueName: "kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-offline-tls") pod "feast-simple-feast-remote-setup-fbb6bb857-vsb76" (UID: "a7cad315-3a4c-4de8-8af4-dc988b205c25") : secret "feast-simple-feast-remote-setup-offline-tls" not found Feb 17 13:07:16.215430 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:07:16.215378 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-online-tls podName:a7cad315-3a4c-4de8-8af4-dc988b205c25 nodeName:}" failed. No retries permitted until 2026-02-17 13:07:16.715369782 +0000 UTC m=+1260.310184585 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "online-tls" (UniqueName: "kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-online-tls") pod "feast-simple-feast-remote-setup-fbb6bb857-vsb76" (UID: "a7cad315-3a4c-4de8-8af4-dc988b205c25") : secret "feast-simple-feast-remote-setup-online-tls" not found Feb 17 13:07:16.215812 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.215790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/a7cad315-3a4c-4de8-8af4-dc988b205c25-feast-data\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.215812 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.215801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/a7cad315-3a4c-4de8-8af4-dc988b205c25-registry-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.217611 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.217582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-ui-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.223843 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.223819 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbz6j\" (UniqueName: \"kubernetes.io/projected/a7cad315-3a4c-4de8-8af4-dc988b205c25-kube-api-access-wbz6j\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.720580 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.720548 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-online-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.720747 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.720649 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-offline-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.722901 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.722873 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-offline-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.723026 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.722907 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-online-tls\") pod \"feast-simple-feast-remote-setup-fbb6bb857-vsb76\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:16.979156 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.979059 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-remote-registry\"/\"feast-simple-feast-remote-setup-dockercfg-nkc2v\"" Feb 17 13:07:16.987777 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:16.987749 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:17.111227 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:17.111195 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76"] Feb 17 13:07:17.115141 ip-10-0-131-216 kubenswrapper[2573]: W0217 13:07:17.115099 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7cad315_3a4c_4de8_8af4_dc988b205c25.slice/crio-98ffddaf7c85ff924eea6c6b3fb5fe3ac20810a1337de767cb2ecb846a3393a6 WatchSource:0}: Error finding container 98ffddaf7c85ff924eea6c6b3fb5fe3ac20810a1337de767cb2ecb846a3393a6: Status 404 returned error can't find the container with id 98ffddaf7c85ff924eea6c6b3fb5fe3ac20810a1337de767cb2ecb846a3393a6 Feb 17 13:07:18.080657 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:18.080623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" event={"ID":"a7cad315-3a4c-4de8-8af4-dc988b205c25","Type":"ContainerStarted","Data":"8afcd6da019246b57b7241ce18ef36348b8a05c5385c6af6a56fe0196582aa23"} Feb 17 13:07:18.080657 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:18.080662 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" event={"ID":"a7cad315-3a4c-4de8-8af4-dc988b205c25","Type":"ContainerStarted","Data":"98ffddaf7c85ff924eea6c6b3fb5fe3ac20810a1337de767cb2ecb846a3393a6"} Feb 17 13:07:21.093951 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:21.093918 2573 generic.go:358] "Generic (PLEG): container finished" podID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerID="8afcd6da019246b57b7241ce18ef36348b8a05c5385c6af6a56fe0196582aa23" exitCode=0 Feb 17 13:07:21.094353 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:21.093989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" event={"ID":"a7cad315-3a4c-4de8-8af4-dc988b205c25","Type":"ContainerDied","Data":"8afcd6da019246b57b7241ce18ef36348b8a05c5385c6af6a56fe0196582aa23"} Feb 17 13:07:22.100191 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:22.100150 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" event={"ID":"a7cad315-3a4c-4de8-8af4-dc988b205c25","Type":"ContainerStarted","Data":"248c8148fca455e0a95cfdd15776fa2841cd46dcda5778b816b8abe148440d18"} Feb 17 13:07:22.100191 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:22.100205 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" 
event={"ID":"a7cad315-3a4c-4de8-8af4-dc988b205c25","Type":"ContainerStarted","Data":"378beabb9490dd768cb9158277872170ea79e2bd6c4b3c4f94eefe7587eab971"} Feb 17 13:07:22.100751 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:22.100218 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" event={"ID":"a7cad315-3a4c-4de8-8af4-dc988b205c25","Type":"ContainerStarted","Data":"318c9a8591ae4e498333cf1c20d772a414399ee2d8e248b561fce5666efb69c9"} Feb 17 13:07:22.121692 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:22.121637 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podStartSLOduration=6.121620803 podStartE2EDuration="6.121620803s" podCreationTimestamp="2026-02-17 13:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:07:22.119461122 +0000 UTC m=+1265.714275946" watchObservedRunningTime="2026-02-17 13:07:22.121620803 +0000 UTC m=+1265.716435628" Feb 17 13:07:22.988782 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:22.988732 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:22.988782 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:22.988786 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:22.989030 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:22.988801 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:22.990931 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:22.990884 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.40:8443: connect: connection refused" Feb 17 13:07:22.991085 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:22.990959 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="online" probeResult="failure" output="Get \"https://10.133.0.40:6567/health\": dial tcp 10.133.0.40:6567: connect: connection refused" Feb 17 13:07:22.991085 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:22.990884 2573 prober.go:120] "Probe failed" probeType="Startup" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="offline" probeResult="failure" output="dial tcp 10.133.0.40:8816: connect: connection refused" Feb 17 13:07:25.990020 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:25.989990 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:25.990591 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:25.990102 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:25.990591 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:25.990456 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:25.990591 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:25.990494 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:25.990795 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:25.990775 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:25.990976 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:25.990955 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:25.994500 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:25.994481 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:26.112836 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:26.112808 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:07:26.115831 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:07:26.115805 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:08:27.224501 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:27.224467 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76"] Feb 17 13:08:27.225014 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:27.224747 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="online" containerID="cri-o://318c9a8591ae4e498333cf1c20d772a414399ee2d8e248b561fce5666efb69c9" gracePeriod=30 Feb 17 13:08:27.225014 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:27.224801 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="offline" containerID="cri-o://378beabb9490dd768cb9158277872170ea79e2bd6c4b3c4f94eefe7587eab971" gracePeriod=30 Feb 17 13:08:27.225014 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:27.224943 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="ui" containerID="cri-o://248c8148fca455e0a95cfdd15776fa2841cd46dcda5778b816b8abe148440d18" gracePeriod=30 Feb 17 13:08:27.359626 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:27.359590 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq"] Feb 17 13:08:27.360091 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:27.359888 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="registry" 
containerID="cri-o://e5ec9e7f4eb2c06971a123b73e29feb975ec06470fc3a5e60038564090be12ff" gracePeriod=30 Feb 17 13:08:27.360091 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:27.359949 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="ui" containerID="cri-o://a1e69d039c04df8cb5de21595639a4da6c9ac24c7ef862d5d77e8eb099179221" gracePeriod=30 Feb 17 13:08:27.360091 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:27.359985 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="online" containerID="cri-o://c8a67fa19fc715902dda4b1773e68cb5465f0c96dbb40ea2888bc44115a56811" gracePeriod=30 Feb 17 13:08:27.360375 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:27.359950 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="offline" containerID="cri-o://d03768321ab116e17d359f8eb929c45cfb8b9e07983a6a41ddf9efd814c8b943" gracePeriod=30 Feb 17 13:08:28.315418 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:28.315300 2573 generic.go:358] "Generic (PLEG): container finished" podID="e7529458-25c0-441b-a096-2d635eb40468" containerID="a1e69d039c04df8cb5de21595639a4da6c9ac24c7ef862d5d77e8eb099179221" exitCode=0 Feb 17 13:08:28.315418 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:28.315327 2573 generic.go:358] "Generic (PLEG): container finished" podID="e7529458-25c0-441b-a096-2d635eb40468" containerID="c8a67fa19fc715902dda4b1773e68cb5465f0c96dbb40ea2888bc44115a56811" exitCode=0 Feb 17 13:08:28.315418 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:28.315371 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" event={"ID":"e7529458-25c0-441b-a096-2d635eb40468","Type":"ContainerDied","Data":"a1e69d039c04df8cb5de21595639a4da6c9ac24c7ef862d5d77e8eb099179221"} Feb 17 13:08:28.315418 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:28.315403 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" event={"ID":"e7529458-25c0-441b-a096-2d635eb40468","Type":"ContainerDied","Data":"c8a67fa19fc715902dda4b1773e68cb5465f0c96dbb40ea2888bc44115a56811"} Feb 17 13:08:28.317388 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:28.317367 2573 generic.go:358] "Generic (PLEG): container finished" podID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerID="248c8148fca455e0a95cfdd15776fa2841cd46dcda5778b816b8abe148440d18" exitCode=0 Feb 17 13:08:28.317388 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:28.317383 2573 generic.go:358] "Generic (PLEG): container finished" podID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerID="318c9a8591ae4e498333cf1c20d772a414399ee2d8e248b561fce5666efb69c9" exitCode=0 Feb 17 13:08:28.317537 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:28.317441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" event={"ID":"a7cad315-3a4c-4de8-8af4-dc988b205c25","Type":"ContainerDied","Data":"248c8148fca455e0a95cfdd15776fa2841cd46dcda5778b816b8abe148440d18"} Feb 17 13:08:28.317537 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:28.317466 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" event={"ID":"a7cad315-3a4c-4de8-8af4-dc988b205c25","Type":"ContainerDied","Data":"318c9a8591ae4e498333cf1c20d772a414399ee2d8e248b561fce5666efb69c9"} Feb 17 13:08:33.614191 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:33.614153 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.39:8443: connect: connection refused" Feb 17 13:08:33.876759 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:33.876666 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="online" probeResult="failure" output="Get \"https://10.133.0.39:6567/health\": dial tcp 10.133.0.39:6567: connect: connection refused" Feb 17 13:08:35.991347 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:35.991294 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.40:8443: connect: connection refused" Feb 17 13:08:36.113339 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:36.113296 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="online" probeResult="failure" output="Get \"https://10.133.0.40:6567/health\": dial tcp 10.133.0.40:6567: connect: connection refused" Feb 17 13:08:43.613720 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:43.613667 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.39:8443: connect: connection refused" Feb 17 13:08:43.876862 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:43.876767 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="online" probeResult="failure" output="Get \"https://10.133.0.39:6567/health\": dial tcp 10.133.0.39:6567: connect: connection refused" Feb 17 13:08:45.990735 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:45.990688 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.40:8443: connect: connection refused" Feb 17 13:08:46.113940 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:46.113896 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="online" probeResult="failure" output="Get \"https://10.133.0.40:6567/health\": dial tcp 10.133.0.40:6567: connect: connection refused" Feb 17 13:08:53.613964 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:53.613909 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" 
containerName="ui" probeResult="failure" output="dial tcp 10.133.0.39:8443: connect: connection refused" Feb 17 13:08:53.614447 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:53.614057 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:08:53.876883 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:53.876796 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="online" probeResult="failure" output="Get \"https://10.133.0.39:6567/health\": dial tcp 10.133.0.39:6567: connect: connection refused" Feb 17 13:08:53.877031 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:53.876920 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:08:55.991424 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:55.991383 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="ui" probeResult="failure" output="dial tcp 10.133.0.40:8443: connect: connection refused" Feb 17 13:08:55.991846 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:55.991506 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:08:56.113252 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:56.113200 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="online" probeResult="failure" output="Get \"https://10.133.0.40:6567/health\": dial tcp 10.133.0.40:6567: connect: connection refused" Feb 17 13:08:56.113439 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:56.113362 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:08:57.413246 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.413215 2573 generic.go:358] "Generic (PLEG): container finished" podID="e7529458-25c0-441b-a096-2d635eb40468" containerID="d03768321ab116e17d359f8eb929c45cfb8b9e07983a6a41ddf9efd814c8b943" exitCode=137 Feb 17 13:08:57.413246 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.413240 2573 generic.go:358] "Generic (PLEG): container finished" podID="e7529458-25c0-441b-a096-2d635eb40468" containerID="e5ec9e7f4eb2c06971a123b73e29feb975ec06470fc3a5e60038564090be12ff" exitCode=137 Feb 17 13:08:57.413708 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.413276 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" event={"ID":"e7529458-25c0-441b-a096-2d635eb40468","Type":"ContainerDied","Data":"d03768321ab116e17d359f8eb929c45cfb8b9e07983a6a41ddf9efd814c8b943"} Feb 17 13:08:57.413708 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.413313 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" event={"ID":"e7529458-25c0-441b-a096-2d635eb40468","Type":"ContainerDied","Data":"e5ec9e7f4eb2c06971a123b73e29feb975ec06470fc3a5e60038564090be12ff"} Feb 17 13:08:57.415471 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.415447 2573 
generic.go:358] "Generic (PLEG): container finished" podID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerID="378beabb9490dd768cb9158277872170ea79e2bd6c4b3c4f94eefe7587eab971" exitCode=137 Feb 17 13:08:57.415581 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.415525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" event={"ID":"a7cad315-3a4c-4de8-8af4-dc988b205c25","Type":"ContainerDied","Data":"378beabb9490dd768cb9158277872170ea79e2bd6c4b3c4f94eefe7587eab971"} Feb 17 13:08:57.871051 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.871027 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:08:57.893298 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.893267 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-ui-tls\") pod \"a7cad315-3a4c-4de8-8af4-dc988b205c25\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " Feb 17 13:08:57.893465 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.893318 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-online-tls\") pod \"a7cad315-3a4c-4de8-8af4-dc988b205c25\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " Feb 17 13:08:57.893465 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.893346 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/a7cad315-3a4c-4de8-8af4-dc988b205c25-feast-data\") pod \"a7cad315-3a4c-4de8-8af4-dc988b205c25\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " Feb 17 13:08:57.893465 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.893408 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-offline-tls\") pod \"a7cad315-3a4c-4de8-8af4-dc988b205c25\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " Feb 17 13:08:57.893616 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.893477 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/a7cad315-3a4c-4de8-8af4-dc988b205c25-registry-tls\") pod \"a7cad315-3a4c-4de8-8af4-dc988b205c25\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " Feb 17 13:08:57.893616 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.893519 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbz6j\" (UniqueName: \"kubernetes.io/projected/a7cad315-3a4c-4de8-8af4-dc988b205c25-kube-api-access-wbz6j\") pod \"a7cad315-3a4c-4de8-8af4-dc988b205c25\" (UID: \"a7cad315-3a4c-4de8-8af4-dc988b205c25\") " Feb 17 13:08:57.894085 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.894039 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7cad315-3a4c-4de8-8af4-dc988b205c25-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a7cad315-3a4c-4de8-8af4-dc988b205c25" (UID: "a7cad315-3a4c-4de8-8af4-dc988b205c25"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Feb 17 13:08:57.894289 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.894250 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7cad315-3a4c-4de8-8af4-dc988b205c25-feast-data" (OuterVolumeSpecName: "feast-data") pod "a7cad315-3a4c-4de8-8af4-dc988b205c25" (UID: "a7cad315-3a4c-4de8-8af4-dc988b205c25"). InnerVolumeSpecName "feast-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Feb 17 13:08:57.895844 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.895801 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-ui-tls" (OuterVolumeSpecName: "ui-tls") pod "a7cad315-3a4c-4de8-8af4-dc988b205c25" (UID: "a7cad315-3a4c-4de8-8af4-dc988b205c25"). InnerVolumeSpecName "ui-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:08:57.895844 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.895829 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-online-tls" (OuterVolumeSpecName: "online-tls") pod "a7cad315-3a4c-4de8-8af4-dc988b205c25" (UID: "a7cad315-3a4c-4de8-8af4-dc988b205c25"). InnerVolumeSpecName "online-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:08:57.896322 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.896297 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7cad315-3a4c-4de8-8af4-dc988b205c25-kube-api-access-wbz6j" (OuterVolumeSpecName: "kube-api-access-wbz6j") pod "a7cad315-3a4c-4de8-8af4-dc988b205c25" (UID: "a7cad315-3a4c-4de8-8af4-dc988b205c25"). InnerVolumeSpecName "kube-api-access-wbz6j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 13:08:57.896529 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.896495 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-offline-tls" (OuterVolumeSpecName: "offline-tls") pod "a7cad315-3a4c-4de8-8af4-dc988b205c25" (UID: "a7cad315-3a4c-4de8-8af4-dc988b205c25"). InnerVolumeSpecName "offline-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:08:57.995073 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.994965 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/configmap/a7cad315-3a4c-4de8-8af4-dc988b205c25-registry-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:08:57.995073 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.995003 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbz6j\" (UniqueName: \"kubernetes.io/projected/a7cad315-3a4c-4de8-8af4-dc988b205c25-kube-api-access-wbz6j\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:08:57.995073 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.995019 2573 reconciler_common.go:299] "Volume detached for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-ui-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:08:57.995073 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.995033 2573 reconciler_common.go:299] "Volume detached for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-online-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:08:57.995073 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.995046 2573 reconciler_common.go:299] "Volume detached for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/a7cad315-3a4c-4de8-8af4-dc988b205c25-feast-data\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:08:57.995073 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:57.995057 2573 reconciler_common.go:299] "Volume detached for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/a7cad315-3a4c-4de8-8af4-dc988b205c25-offline-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:08:58.003431 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.003410 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:08:58.095872 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.095835 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/e7529458-25c0-441b-a096-2d635eb40468-feast-data\") pod \"e7529458-25c0-441b-a096-2d635eb40468\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " Feb 17 13:08:58.095872 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.095872 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-registry-tls\") pod \"e7529458-25c0-441b-a096-2d635eb40468\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " Feb 17 13:08:58.096158 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.095915 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-ui-tls\") pod \"e7529458-25c0-441b-a096-2d635eb40468\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " Feb 17 13:08:58.096158 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.095949 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl587\" (UniqueName: \"kubernetes.io/projected/e7529458-25c0-441b-a096-2d635eb40468-kube-api-access-xl587\") pod \"e7529458-25c0-441b-a096-2d635eb40468\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " Feb 17 13:08:58.096158 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.095987 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-online-tls\") pod \"e7529458-25c0-441b-a096-2d635eb40468\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " Feb 17 13:08:58.096158 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.096031 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-offline-tls\") pod \"e7529458-25c0-441b-a096-2d635eb40468\" (UID: \"e7529458-25c0-441b-a096-2d635eb40468\") " Feb 17 13:08:58.096561 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.096529 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7529458-25c0-441b-a096-2d635eb40468-feast-data" (OuterVolumeSpecName: "feast-data") pod "e7529458-25c0-441b-a096-2d635eb40468" (UID: "e7529458-25c0-441b-a096-2d635eb40468"). InnerVolumeSpecName "feast-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Feb 17 13:08:58.098327 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.098298 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-online-tls" (OuterVolumeSpecName: "online-tls") pod "e7529458-25c0-441b-a096-2d635eb40468" (UID: "e7529458-25c0-441b-a096-2d635eb40468"). InnerVolumeSpecName "online-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:08:58.098432 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.098359 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7529458-25c0-441b-a096-2d635eb40468-kube-api-access-xl587" (OuterVolumeSpecName: "kube-api-access-xl587") pod "e7529458-25c0-441b-a096-2d635eb40468" (UID: "e7529458-25c0-441b-a096-2d635eb40468"). InnerVolumeSpecName "kube-api-access-xl587". PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 13:08:58.098432 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.098394 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-offline-tls" (OuterVolumeSpecName: "offline-tls") pod "e7529458-25c0-441b-a096-2d635eb40468" (UID: "e7529458-25c0-441b-a096-2d635eb40468"). InnerVolumeSpecName "offline-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:08:58.098564 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.098546 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e7529458-25c0-441b-a096-2d635eb40468" (UID: "e7529458-25c0-441b-a096-2d635eb40468"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:08:58.098619 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.098561 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-ui-tls" (OuterVolumeSpecName: "ui-tls") pod "e7529458-25c0-441b-a096-2d635eb40468" (UID: "e7529458-25c0-441b-a096-2d635eb40468"). InnerVolumeSpecName "ui-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Feb 17 13:08:58.196723 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.196682 2573 reconciler_common.go:299] "Volume detached for volume \"ui-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-ui-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:08:58.196723 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.196714 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xl587\" (UniqueName: \"kubernetes.io/projected/e7529458-25c0-441b-a096-2d635eb40468-kube-api-access-xl587\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:08:58.196723 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.196726 2573 reconciler_common.go:299] "Volume detached for volume \"online-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-online-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:08:58.196723 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.196737 2573 reconciler_common.go:299] "Volume detached for volume \"offline-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-offline-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:08:58.197012 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.196747 2573 reconciler_common.go:299] "Volume detached for volume \"feast-data\" (UniqueName: \"kubernetes.io/empty-dir/e7529458-25c0-441b-a096-2d635eb40468-feast-data\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:08:58.197012 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.196755 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/secret/e7529458-25c0-441b-a096-2d635eb40468-registry-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:08:58.421800 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.421761 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" event={"ID":"e7529458-25c0-441b-a096-2d635eb40468","Type":"ContainerDied","Data":"0e79b435d4965fa63c828e5bdb4d4e1245ec17c98415296c267776d0002775c4"} Feb 17 13:08:58.421800 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.421788 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq" Feb 17 13:08:58.422317 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.421820 2573 scope.go:117] "RemoveContainer" containerID="a1e69d039c04df8cb5de21595639a4da6c9ac24c7ef862d5d77e8eb099179221" Feb 17 13:08:58.423955 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.423933 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" Feb 17 13:08:58.424063 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.423962 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76" event={"ID":"a7cad315-3a4c-4de8-8af4-dc988b205c25","Type":"ContainerDied","Data":"98ffddaf7c85ff924eea6c6b3fb5fe3ac20810a1337de767cb2ecb846a3393a6"} Feb 17 13:08:58.431777 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.431760 2573 scope.go:117] "RemoveContainer" containerID="d03768321ab116e17d359f8eb929c45cfb8b9e07983a6a41ddf9efd814c8b943" Feb 17 13:08:58.439640 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.439620 2573 scope.go:117] "RemoveContainer" containerID="c8a67fa19fc715902dda4b1773e68cb5465f0c96dbb40ea2888bc44115a56811" Feb 17 13:08:58.443622 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.443598 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq"] Feb 17 13:08:58.448427 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.448402 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-feast/feast-simple-feast-setup-565c46b746-czfbq"] Feb 17 13:08:58.449222 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.449200 2573 scope.go:117] "RemoveContainer" containerID="e5ec9e7f4eb2c06971a123b73e29feb975ec06470fc3a5e60038564090be12ff" Feb 17 13:08:58.456984 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.456939 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76"] Feb 17 13:08:58.457510 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.457498 2573 scope.go:117] "RemoveContainer" containerID="c74cff4f86636e795147d255e8684e911a62a570dba0358e5c8aba56e611cc1c" Feb 17 13:08:58.462429 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.462406 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-remote-registry/feast-simple-feast-remote-setup-fbb6bb857-vsb76"] Feb 17 13:08:58.470121 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.470086 2573 scope.go:117] "RemoveContainer" containerID="248c8148fca455e0a95cfdd15776fa2841cd46dcda5778b816b8abe148440d18" Feb 17 13:08:58.478243 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.478222 2573 scope.go:117] "RemoveContainer" containerID="378beabb9490dd768cb9158277872170ea79e2bd6c4b3c4f94eefe7587eab971" Feb 17 13:08:58.485720 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.485697 2573 scope.go:117] "RemoveContainer" containerID="318c9a8591ae4e498333cf1c20d772a414399ee2d8e248b561fce5666efb69c9" Feb 17 13:08:58.493230 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.493215 2573 scope.go:117] "RemoveContainer" containerID="8afcd6da019246b57b7241ce18ef36348b8a05c5385c6af6a56fe0196582aa23" Feb 17 13:08:58.934322 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.934292 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" path="/var/lib/kubelet/pods/a7cad315-3a4c-4de8-8af4-dc988b205c25/volumes" Feb 17 13:08:58.934802 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:08:58.934789 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7529458-25c0-441b-a096-2d635eb40468" path="/var/lib/kubelet/pods/e7529458-25c0-441b-a096-2d635eb40468/volumes" Feb 17 13:09:15.899225 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899190 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-hlbvm/must-gather-stwrx"] Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899532 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="ui" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899544 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="ui" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899553 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="registry" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899559 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="registry" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899565 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="offline" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899571 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="offline" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899580 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="feast-init" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899585 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="feast-init" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899592 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="ui" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899598 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="ui" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899604 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="offline" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899609 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="offline" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899614 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="online" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899619 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="online" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899627 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="feast-init" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899632 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="feast-init" Feb 17 13:09:15.899703 ip-10-0-131-216 
kubenswrapper[2573]: I0217 13:09:15.899639 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="online" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899644 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="online" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899693 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="offline" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899700 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="ui" Feb 17 13:09:15.899703 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899709 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="ui" Feb 17 13:09:15.900467 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899717 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="registry" Feb 17 13:09:15.900467 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899723 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="offline" Feb 17 13:09:15.900467 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899731 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7529458-25c0-441b-a096-2d635eb40468" containerName="online" Feb 17 13:09:15.900467 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.899737 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7cad315-3a4c-4de8-8af4-dc988b205c25" containerName="online" Feb 17 13:09:15.902907 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.902878 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlbvm/must-gather-stwrx" Feb 17 13:09:15.905529 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.905504 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hlbvm\"/\"default-dockercfg-pb6l2\"" Feb 17 13:09:15.905529 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.905522 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hlbvm\"/\"openshift-service-ca.crt\"" Feb 17 13:09:15.906816 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.906796 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hlbvm\"/\"kube-root-ca.crt\"" Feb 17 13:09:15.908716 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.908693 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hlbvm/must-gather-stwrx"] Feb 17 13:09:15.946507 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.946472 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9c170bc-3caa-4f86-bbe7-d572d4e280a0-must-gather-output\") pod \"must-gather-stwrx\" (UID: \"f9c170bc-3caa-4f86-bbe7-d572d4e280a0\") " pod="openshift-must-gather-hlbvm/must-gather-stwrx" Feb 17 13:09:15.946717 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:15.946593 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp26k\" (UniqueName: \"kubernetes.io/projected/f9c170bc-3caa-4f86-bbe7-d572d4e280a0-kube-api-access-fp26k\") pod \"must-gather-stwrx\" (UID: \"f9c170bc-3caa-4f86-bbe7-d572d4e280a0\") " pod="openshift-must-gather-hlbvm/must-gather-stwrx" Feb 17 13:09:16.047812 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:16.047767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9c170bc-3caa-4f86-bbe7-d572d4e280a0-must-gather-output\") pod \"must-gather-stwrx\" (UID: \"f9c170bc-3caa-4f86-bbe7-d572d4e280a0\") " pod="openshift-must-gather-hlbvm/must-gather-stwrx" Feb 17 13:09:16.048022 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:16.047864 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fp26k\" (UniqueName: \"kubernetes.io/projected/f9c170bc-3caa-4f86-bbe7-d572d4e280a0-kube-api-access-fp26k\") pod \"must-gather-stwrx\" (UID: \"f9c170bc-3caa-4f86-bbe7-d572d4e280a0\") " pod="openshift-must-gather-hlbvm/must-gather-stwrx" Feb 17 13:09:16.048216 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:16.048190 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9c170bc-3caa-4f86-bbe7-d572d4e280a0-must-gather-output\") pod \"must-gather-stwrx\" (UID: \"f9c170bc-3caa-4f86-bbe7-d572d4e280a0\") " pod="openshift-must-gather-hlbvm/must-gather-stwrx" Feb 17 13:09:16.055718 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:16.055686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp26k\" (UniqueName: \"kubernetes.io/projected/f9c170bc-3caa-4f86-bbe7-d572d4e280a0-kube-api-access-fp26k\") pod \"must-gather-stwrx\" (UID: \"f9c170bc-3caa-4f86-bbe7-d572d4e280a0\") " pod="openshift-must-gather-hlbvm/must-gather-stwrx" Feb 17 13:09:16.221767 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:16.221671 2573 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-hlbvm/must-gather-stwrx" Feb 17 13:09:16.338645 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:16.338620 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hlbvm/must-gather-stwrx"] Feb 17 13:09:16.340590 ip-10-0-131-216 kubenswrapper[2573]: W0217 13:09:16.340558 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9c170bc_3caa_4f86_bbe7_d572d4e280a0.slice/crio-a5650f68b3c62be92a97e5eaf39409112bb5984ce70e6710206ac089542f9e7d WatchSource:0}: Error finding container a5650f68b3c62be92a97e5eaf39409112bb5984ce70e6710206ac089542f9e7d: Status 404 returned error can't find the container with id a5650f68b3c62be92a97e5eaf39409112bb5984ce70e6710206ac089542f9e7d Feb 17 13:09:16.480427 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:16.480342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlbvm/must-gather-stwrx" event={"ID":"f9c170bc-3caa-4f86-bbe7-d572d4e280a0","Type":"ContainerStarted","Data":"a5650f68b3c62be92a97e5eaf39409112bb5984ce70e6710206ac089542f9e7d"} Feb 17 13:09:21.497290 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:21.497251 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlbvm/must-gather-stwrx" event={"ID":"f9c170bc-3caa-4f86-bbe7-d572d4e280a0","Type":"ContainerStarted","Data":"4664f68009f91288af8500b8148dfd3d9ea0f5542bf4af7c213434d8cd0b98f6"} Feb 17 13:09:21.497290 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:21.497292 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlbvm/must-gather-stwrx" event={"ID":"f9c170bc-3caa-4f86-bbe7-d572d4e280a0","Type":"ContainerStarted","Data":"cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae"} Feb 17 13:09:21.512168 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:21.512098 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hlbvm/must-gather-stwrx" podStartSLOduration=2.20192937 podStartE2EDuration="6.512083335s" podCreationTimestamp="2026-02-17 13:09:15 +0000 UTC" firstStartedPulling="2026-02-17 13:09:16.342282399 +0000 UTC m=+1379.937097202" lastFinishedPulling="2026-02-17 13:09:20.652436355 +0000 UTC m=+1384.247251167" observedRunningTime="2026-02-17 13:09:21.511102462 +0000 UTC m=+1385.105917288" watchObservedRunningTime="2026-02-17 13:09:21.512083335 +0000 UTC m=+1385.106898160" Feb 17 13:09:29.523738 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:29.523702 2573 generic.go:358] "Generic (PLEG): container finished" podID="f9c170bc-3caa-4f86-bbe7-d572d4e280a0" containerID="cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae" exitCode=0 Feb 17 13:09:29.524182 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:29.523775 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlbvm/must-gather-stwrx" event={"ID":"f9c170bc-3caa-4f86-bbe7-d572d4e280a0","Type":"ContainerDied","Data":"cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae"} Feb 17 13:09:29.524182 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:29.524095 2573 scope.go:117] "RemoveContainer" containerID="cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae" Feb 17 13:09:29.559465 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:29.559436 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-hlbvm_must-gather-stwrx_f9c170bc-3caa-4f86-bbe7-d572d4e280a0/gather/0.log" Feb 17 13:09:32.683238 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:32.683208 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6d7mm_ba4af195-0270-4e87-a0fe-8e7fdd18175d/global-pull-secret-syncer/0.log" Feb 17 13:09:32.887283 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:32.887251 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pfxld_93f0a968-7f6f-420d-9fb5-baf856136755/konnectivity-agent/0.log" Feb 17 13:09:32.932657 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:32.932631 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-216.ec2.internal_e80efaa3ce4d23555689acf7418c8107/haproxy/0.log" Feb 17 13:09:34.865295 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:34.865248 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hlbvm/must-gather-stwrx"] Feb 17 13:09:34.865798 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:34.865549 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-hlbvm/must-gather-stwrx" podUID="f9c170bc-3caa-4f86-bbe7-d572d4e280a0" containerName="copy" containerID="cri-o://4664f68009f91288af8500b8148dfd3d9ea0f5542bf4af7c213434d8cd0b98f6" gracePeriod=2 Feb 17 13:09:34.867133 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:34.867094 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hlbvm/must-gather-stwrx"] Feb 17 13:09:34.867716 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:34.867688 2573 status_manager.go:895] "Failed to get status for pod" podUID="f9c170bc-3caa-4f86-bbe7-d572d4e280a0" pod="openshift-must-gather-hlbvm/must-gather-stwrx" err="pods \"must-gather-stwrx\" is forbidden: User \"system:node:ip-10-0-131-216.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hlbvm\": no relationship found between node 'ip-10-0-131-216.ec2.internal' and this object" Feb 17 13:09:35.105219 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.105198 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hlbvm_must-gather-stwrx_f9c170bc-3caa-4f86-bbe7-d572d4e280a0/copy/0.log" Feb 17 13:09:35.105543 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.105528 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlbvm/must-gather-stwrx" Feb 17 13:09:35.225262 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.225229 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp26k\" (UniqueName: \"kubernetes.io/projected/f9c170bc-3caa-4f86-bbe7-d572d4e280a0-kube-api-access-fp26k\") pod \"f9c170bc-3caa-4f86-bbe7-d572d4e280a0\" (UID: \"f9c170bc-3caa-4f86-bbe7-d572d4e280a0\") " Feb 17 13:09:35.225432 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.225334 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9c170bc-3caa-4f86-bbe7-d572d4e280a0-must-gather-output\") pod \"f9c170bc-3caa-4f86-bbe7-d572d4e280a0\" (UID: \"f9c170bc-3caa-4f86-bbe7-d572d4e280a0\") " Feb 17 13:09:35.225721 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.225700 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c170bc-3caa-4f86-bbe7-d572d4e280a0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f9c170bc-3caa-4f86-bbe7-d572d4e280a0" (UID: "f9c170bc-3caa-4f86-bbe7-d572d4e280a0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Feb 17 13:09:35.227533 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.227503 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c170bc-3caa-4f86-bbe7-d572d4e280a0-kube-api-access-fp26k" (OuterVolumeSpecName: "kube-api-access-fp26k") pod "f9c170bc-3caa-4f86-bbe7-d572d4e280a0" (UID: "f9c170bc-3caa-4f86-bbe7-d572d4e280a0"). InnerVolumeSpecName "kube-api-access-fp26k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Feb 17 13:09:35.326006 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.325969 2573 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9c170bc-3caa-4f86-bbe7-d572d4e280a0-must-gather-output\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:09:35.326006 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.325998 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fp26k\" (UniqueName: \"kubernetes.io/projected/f9c170bc-3caa-4f86-bbe7-d572d4e280a0-kube-api-access-fp26k\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Feb 17 13:09:35.542676 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.542593 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hlbvm_must-gather-stwrx_f9c170bc-3caa-4f86-bbe7-d572d4e280a0/copy/0.log" Feb 17 13:09:35.542986 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.542957 2573 generic.go:358] "Generic (PLEG): container finished" podID="f9c170bc-3caa-4f86-bbe7-d572d4e280a0" containerID="4664f68009f91288af8500b8148dfd3d9ea0f5542bf4af7c213434d8cd0b98f6" exitCode=143 Feb 17 13:09:35.543142 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.543006 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlbvm/must-gather-stwrx"
Feb 17 13:09:35.543142 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.543062 2573 scope.go:117] "RemoveContainer" containerID="4664f68009f91288af8500b8148dfd3d9ea0f5542bf4af7c213434d8cd0b98f6"
Feb 17 13:09:35.551874 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.551849 2573 scope.go:117] "RemoveContainer" containerID="cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae"
Feb 17 13:09:35.585940 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.585912 2573 scope.go:117] "RemoveContainer" containerID="4664f68009f91288af8500b8148dfd3d9ea0f5542bf4af7c213434d8cd0b98f6"
Feb 17 13:09:35.586301 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:09:35.586280 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4664f68009f91288af8500b8148dfd3d9ea0f5542bf4af7c213434d8cd0b98f6\": container with ID starting with 4664f68009f91288af8500b8148dfd3d9ea0f5542bf4af7c213434d8cd0b98f6 not found: ID does not exist" containerID="4664f68009f91288af8500b8148dfd3d9ea0f5542bf4af7c213434d8cd0b98f6"
Feb 17 13:09:35.586378 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.586311 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4664f68009f91288af8500b8148dfd3d9ea0f5542bf4af7c213434d8cd0b98f6"} err="failed to get container status \"4664f68009f91288af8500b8148dfd3d9ea0f5542bf4af7c213434d8cd0b98f6\": rpc error: code = NotFound desc = could not find container \"4664f68009f91288af8500b8148dfd3d9ea0f5542bf4af7c213434d8cd0b98f6\": container with ID starting with 4664f68009f91288af8500b8148dfd3d9ea0f5542bf4af7c213434d8cd0b98f6 not found: ID does not exist"
Feb 17 13:09:35.586378 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.586332 2573 scope.go:117] "RemoveContainer" containerID="cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae"
Feb 17 13:09:35.586606 ip-10-0-131-216 kubenswrapper[2573]: E0217 13:09:35.586586 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae\": container with ID starting with cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae not found: ID does not exist" containerID="cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae"
Feb 17 13:09:35.586648 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.586616 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae"} err="failed to get container status \"cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae\": rpc error: code = NotFound desc = could not find container \"cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae\": container with ID starting with cd6fdd65d8a1236ac282cc6dd97a3bb68107950044b045f7f58111e735d783ae not found: ID does not exist"
Feb 17 13:09:35.834605 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.834519 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-565c7d9656-rhxxs_a8f510bc-d548-48d5-88df-9aad16d1fee4/cluster-monitoring-operator/0.log"
Feb 17 13:09:35.862132 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.862066 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-77b75dc9f9-q67mp_39f8d1c9-c3cb-4a8a-a78d-a715c0f92754/kube-state-metrics/0.log"
Feb 17 13:09:35.892481 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.892447 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-77b75dc9f9-q67mp_39f8d1c9-c3cb-4a8a-a78d-a715c0f92754/kube-rbac-proxy-main/0.log"
Feb 17 13:09:35.913741 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:35.913719 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-77b75dc9f9-q67mp_39f8d1c9-c3cb-4a8a-a78d-a715c0f92754/kube-rbac-proxy-self/0.log"
Feb 17 13:09:36.186662 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.186637 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zdptk_aae180f1-f47e-481b-877d-af97cf7e7caa/node-exporter/0.log"
Feb 17 13:09:36.206649 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.206614 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zdptk_aae180f1-f47e-481b-877d-af97cf7e7caa/kube-rbac-proxy/0.log"
Feb 17 13:09:36.229096 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.229077 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zdptk_aae180f1-f47e-481b-877d-af97cf7e7caa/init-textfile/0.log"
Feb 17 13:09:36.256482 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.256458 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-6b69bf8d6b-74lsq_ea77fa6e-f053-4051-ab3d-c52ead601a19/kube-rbac-proxy-main/0.log"
Feb 17 13:09:36.276563 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.276535 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-6b69bf8d6b-74lsq_ea77fa6e-f053-4051-ab3d-c52ead601a19/kube-rbac-proxy-self/0.log"
Feb 17 13:09:36.298317 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.298298 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-6b69bf8d6b-74lsq_ea77fa6e-f053-4051-ab3d-c52ead601a19/openshift-state-metrics/0.log"
Feb 17 13:09:36.593136 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.593039 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75748fc6cd-bwhxw_21ba913d-e137-4041-8efc-9da24c250805/telemeter-client/0.log"
Feb 17 13:09:36.616028 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.616001 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75748fc6cd-bwhxw_21ba913d-e137-4041-8efc-9da24c250805/reload/0.log"
Feb 17 13:09:36.635980 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.635951 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75748fc6cd-bwhxw_21ba913d-e137-4041-8efc-9da24c250805/kube-rbac-proxy/0.log"
Feb 17 13:09:36.666906 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.666865 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7ff677b5f6-qjb5l_fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36/thanos-query/0.log"
Feb 17 13:09:36.686909 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.686870 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7ff677b5f6-qjb5l_fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36/kube-rbac-proxy-web/0.log"
Feb 17 13:09:36.707205 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.707179 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7ff677b5f6-qjb5l_fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36/kube-rbac-proxy/0.log"
Feb 17 13:09:36.730687 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.730663 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7ff677b5f6-qjb5l_fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36/prom-label-proxy/0.log"
Feb 17 13:09:36.751939 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.751916 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7ff677b5f6-qjb5l_fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36/kube-rbac-proxy-rules/0.log"
Feb 17 13:09:36.774351 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.774312 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7ff677b5f6-qjb5l_fe0c3e07-4e2f-4ec8-a097-8e723c5fcf36/kube-rbac-proxy-metrics/0.log"
Feb 17 13:09:36.933621 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:36.933586 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c170bc-3caa-4f86-bbe7-d572d4e280a0" path="/var/lib/kubelet/pods/f9c170bc-3caa-4f86-bbe7-d572d4e280a0/volumes"
Feb 17 13:09:38.288991 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:38.288960 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/2.log"
Feb 17 13:09:38.295555 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:38.295529 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5744d8689c-4b6mv_276ac3fc-41f7-4f46-8cd1-e26a91986d96/console-operator/3.log"
Feb 17 13:09:38.630582 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:38.630505 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56554b9687-q8sz2_ae8b475f-3383-4364-9d53-9f68d6cf3f63/console/0.log"
Feb 17 13:09:38.659138 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:38.659076 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-58b949d66d-pn5bm_056c76cf-5dfe-4900-898b-551996c808b0/download-server/0.log"
Feb 17 13:09:39.002977 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:38.999472 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-56b878674-sddpm_a75194e0-0c8c-4b2e-9d40-9622476fe327/volume-data-source-validator/0.log"
Feb 17 13:09:39.472354 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.472317 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"]
Feb 17 13:09:39.472740 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.472682 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9c170bc-3caa-4f86-bbe7-d572d4e280a0" containerName="copy"
Feb 17 13:09:39.472740 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.472693 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c170bc-3caa-4f86-bbe7-d572d4e280a0" containerName="copy"
Feb 17 13:09:39.472740 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.472711 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9c170bc-3caa-4f86-bbe7-d572d4e280a0" containerName="gather"
Feb 17 13:09:39.472740 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.472717 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c170bc-3caa-4f86-bbe7-d572d4e280a0" containerName="gather"
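
The paired E/I entries at 13:09:35.586 above are a benign race rather than a failure: the kubelet asked the runtime to remove two containers that CRI-O had already pruned, so the follow-up ContainerStatus lookup returned gRPC NotFound and pod_container_deletor merely logged the result. A minimal Go sketch of how such a CRI error can be classified, assuming only that the error carries a gRPC status code, as the "rpc error: code = NotFound" text above indicates; the helper name is illustrative, not the kubelet's own:

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // containerAlreadyGone reports whether a CRI call failed only because the
    // container no longer exists -- the case the log shows being tolerated.
    func containerAlreadyGone(err error) bool {
        return status.Code(err) == codes.NotFound
    }

    func main() {
        // Reconstructed error in the shape the log records.
        err := status.Error(codes.NotFound, "could not find container")
        fmt.Println(containerAlreadyGone(err)) // true: log it, then move on
    }

Treating NotFound as success is the usual idempotent-delete pattern: the desired end state (container gone) already holds, so there is nothing left to do.
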
Feb 17 13:09:39.472885 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.472771 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9c170bc-3caa-4f86-bbe7-d572d4e280a0" containerName="copy"
Feb 17 13:09:39.472885 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.472780 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9c170bc-3caa-4f86-bbe7-d572d4e280a0" containerName="gather"
Feb 17 13:09:39.475726 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.475709 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.478074 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.478051 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7x4hp\"/\"default-dockercfg-vnmgq\""
Feb 17 13:09:39.478234 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.478052 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7x4hp\"/\"openshift-service-ca.crt\""
Feb 17 13:09:39.479291 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.479277 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7x4hp\"/\"kube-root-ca.crt\""
Feb 17 13:09:39.484235 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.484214 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"]
Feb 17 13:09:39.657732 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.657706 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h27xf_d63493ac-401c-46c9-8e2d-344b22008d74/dns/0.log"
Feb 17 13:09:39.660961 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.660939 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-proc\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.661043 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.660975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-lib-modules\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.661043 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.661030 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-sys\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.661138 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.661057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvv4r\" (UniqueName: \"kubernetes.io/projected/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-kube-api-access-gvv4r\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.661186 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.661140 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-podres\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.677439 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.677417 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h27xf_d63493ac-401c-46c9-8e2d-344b22008d74/kube-rbac-proxy/0.log"
Feb 17 13:09:39.743135 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.743032 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4jqbk_13fc6c26-7ed3-4ea9-9c4f-4317cdd2de55/dns-node-resolver/0.log"
Feb 17 13:09:39.761507 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.761479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-sys\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.761660 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.761511 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvv4r\" (UniqueName: \"kubernetes.io/projected/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-kube-api-access-gvv4r\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.761660 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.761593 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-sys\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.761660 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.761641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-podres\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.761841 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.761722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-proc\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.761841 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.761762 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-lib-modules\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.761841 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.761792 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-podres\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.761841 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.761802 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-proc\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.762017 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.761936 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-lib-modules\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.768935 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.768914 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvv4r\" (UniqueName: \"kubernetes.io/projected/a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b-kube-api-access-gvv4r\") pod \"perf-node-gather-daemonset-4nckg\" (UID: \"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b\") " pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.785960 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.785933 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:39.903008 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.902971 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"]
Feb 17 13:09:39.906024 ip-10-0-131-216 kubenswrapper[2573]: W0217 13:09:39.906001 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda756ca2c_7867_4d4b_bd2f_0e7a81d0cc8b.slice/crio-7f32f69f0e530309caaa064d548cb243d9d18a9f7b1681cb71ac2d86bc9890fe WatchSource:0}: Error finding container 7f32f69f0e530309caaa064d548cb243d9d18a9f7b1681cb71ac2d86bc9890fe: Status 404 returned error can't find the container with id 7f32f69f0e530309caaa064d548cb243d9d18a9f7b1681cb71ac2d86bc9890fe
Feb 17 13:09:39.907599 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:39.907577 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 13:09:40.197305 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:40.197276 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-55d85f6897-jnlnq_e37820d2-72f5-4937-b1bf-d5f9263bc97c/registry/0.log"
Feb 17 13:09:40.238312 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:40.238283 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mdbbf_8ee47699-3923-4434-9f20-86ebd9785b9f/node-ca/0.log"
Feb 17 13:09:40.559227 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:40.559195 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg" event={"ID":"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b","Type":"ContainerStarted","Data":"f159e0a318c9eb9bc8e4b5b167fc847a511d27cf01c6832db37e7b63417bd049"}
Feb 17 13:09:40.559227 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:40.559230 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg" event={"ID":"a756ca2c-7867-4d4b-bd2f-0e7a81d0cc8b","Type":"ContainerStarted","Data":"7f32f69f0e530309caaa064d548cb243d9d18a9f7b1681cb71ac2d86bc9890fe"}
Feb 17 13:09:40.559752 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:40.559326 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg"
Feb 17 13:09:40.575483 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:40.575441 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg" podStartSLOduration=1.5754278670000001 podStartE2EDuration="1.575427867s" podCreationTimestamp="2026-02-17 13:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:09:40.573724541 +0000 UTC m=+1404.168539368" watchObservedRunningTime="2026-02-17 13:09:40.575427867 +0000 UTC m=+1404.170242745"
Feb 17 13:09:41.186769 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:41.186743 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6q7rb_b0b91144-3ba6-4290-8174-1c2bdc3ca3d1/serve-healthcheck-canary/0.log"
Feb 17 13:09:41.518018 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:41.517932 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5d56856ff5-ctf9v_1a8cc667-aa21-4c52-810c-330a53bdcfd3/insights-operator/0.log"
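
The pod_startup_latency_tracker entry above reports podStartE2EDuration="1.575427867s", which is exactly watchObservedRunningTime minus podCreationTimestamp; the zeroed firstStartedPulling/lastFinishedPulling values suggest no image pull was needed. A small Go check of that arithmetic, with both timestamps copied from the log entry:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Layout matching the "2026-02-17 13:09:39 +0000 UTC" form in the log.
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

        created, err := time.Parse(layout, "2026-02-17 13:09:39 +0000 UTC")
        if err != nil {
            panic(err)
        }
        watched, err := time.Parse(layout, "2026-02-17 13:09:40.575427867 +0000 UTC")
        if err != nil {
            panic(err)
        }

        // watchObservedRunningTime - podCreationTimestamp = podStartE2EDuration.
        fmt.Println(watched.Sub(created)) // 1.575427867s
    }

The roughly 1.6 s figure is consistent with the surrounding entries: the pod was added at 13:09:39.472 and both ContainerStarted PLEG events arrived by 13:09:40.559.
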
path="/var/log/pods/openshift-insights_insights-operator-5d56856ff5-ctf9v_1a8cc667-aa21-4c52-810c-330a53bdcfd3/insights-operator/0.log" Feb 17 13:09:41.518237 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:41.518215 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5d56856ff5-ctf9v_1a8cc667-aa21-4c52-810c-330a53bdcfd3/insights-operator/1.log" Feb 17 13:09:41.674067 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:41.674038 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wwx6c_0950e3dd-d44c-43e4-a432-70d036ed1820/kube-rbac-proxy/0.log" Feb 17 13:09:41.693170 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:41.693149 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wwx6c_0950e3dd-d44c-43e4-a432-70d036ed1820/exporter/0.log" Feb 17 13:09:41.713364 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:41.713344 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wwx6c_0950e3dd-d44c-43e4-a432-70d036ed1820/extractor/0.log" Feb 17 13:09:45.767066 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:45.767032 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs_d1ead7d2-81f9-4afa-8d87-188a741e9848/kube-storage-version-migrator-operator/1.log" Feb 17 13:09:45.768773 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:45.768737 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-55bf9dc6f6-q7mrs_d1ead7d2-81f9-4afa-8d87-188a741e9848/kube-storage-version-migrator-operator/0.log" Feb 17 13:09:46.497128 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:46.497093 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jlcw_bb54e080-0e5a-47e9-bb34-5749143aff6e/kube-multus-additional-cni-plugins/0.log" Feb 17 13:09:46.518232 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:46.518192 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jlcw_bb54e080-0e5a-47e9-bb34-5749143aff6e/egress-router-binary-copy/0.log" Feb 17 13:09:46.537344 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:46.537319 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jlcw_bb54e080-0e5a-47e9-bb34-5749143aff6e/cni-plugins/0.log" Feb 17 13:09:46.556648 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:46.556627 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jlcw_bb54e080-0e5a-47e9-bb34-5749143aff6e/bond-cni-plugin/0.log" Feb 17 13:09:46.572756 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:46.572736 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7x4hp/perf-node-gather-daemonset-4nckg" Feb 17 13:09:46.577179 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:46.577161 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jlcw_bb54e080-0e5a-47e9-bb34-5749143aff6e/routeoverride-cni/0.log" Feb 17 13:09:46.598739 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:46.598717 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jlcw_bb54e080-0e5a-47e9-bb34-5749143aff6e/whereabouts-cni-bincopy/0.log" Feb 17 13:09:46.618141 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:46.618121 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jlcw_bb54e080-0e5a-47e9-bb34-5749143aff6e/whereabouts-cni/0.log" Feb 17 13:09:47.008440 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:47.008409 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ttlg5_a8056817-5e72-49a7-accb-32ae96f50dcb/kube-multus/0.log" Feb 17 13:09:47.029095 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:47.029075 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cnhns_ad710990-167a-49aa-bad8-faa970a4c3bb/network-metrics-daemon/0.log" Feb 17 13:09:47.046759 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:47.046740 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cnhns_ad710990-167a-49aa-bad8-faa970a4c3bb/kube-rbac-proxy/0.log" Feb 17 13:09:47.844567 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:47.844531 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-controller/0.log" Feb 17 13:09:47.860876 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:47.860854 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/0.log" Feb 17 13:09:47.874172 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:47.874151 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovn-acl-logging/1.log" Feb 17 13:09:47.895853 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:47.895833 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/kube-rbac-proxy-node/0.log" Feb 17 13:09:47.916967 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:47.916941 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/kube-rbac-proxy-ovn-metrics/0.log" Feb 17 13:09:47.934528 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:47.934500 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/northd/0.log" Feb 17 13:09:47.954217 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:47.954190 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/nbdb/0.log" Feb 17 13:09:47.974937 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:47.974915 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/sbdb/0.log" Feb 17 13:09:48.204472 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:48.204440 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-494bm_d39928a0-1a0f-4b0b-b327-943d7c48930d/ovnkube-controller/0.log" Feb 17 13:09:49.716161 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:49.716134 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-5f8c4fff5b-fk4w7_af5d6011-8448-486c-8483-99cdd3870524/check-endpoints/0.log" Feb 17 13:09:49.765683 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:49.765648 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-kncvl_52127944-2f75-482d-bab6-3694ac75b66a/network-check-target-container/0.log" Feb 17 13:09:50.735601 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:50.735574 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-vhlqf_3daec06e-ea34-4fc8-9592-ac5ec216491e/iptables-alerter/0.log" Feb 17 13:09:51.302804 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:51.302779 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-jb42s_f4d28204-67cd-4aef-b69e-07d8309c6436/tuned/0.log" Feb 17 13:09:53.828096 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:53.828065 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-ffd9f846b-scl5h_c2d6500e-0397-48cd-bf45-464b40e47782/service-ca-operator/1.log" Feb 17 13:09:53.829745 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:53.829720 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-ffd9f846b-scl5h_c2d6500e-0397-48cd-bf45-464b40e47782/service-ca-operator/0.log" Feb 17 13:09:54.098680 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:54.098604 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-8495d7c844-5jhjj_ecd00037-caa6-488d-8d6d-2b228d11821f/service-ca-controller/0.log" Feb 17 13:09:54.547371 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:54.547341 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-tzn4r_a17ee1be-195d-4b7e-8690-072cd431deef/csi-driver/0.log" Feb 17 13:09:54.567304 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:54.567278 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-tzn4r_a17ee1be-195d-4b7e-8690-072cd431deef/csi-node-driver-registrar/0.log" Feb 17 13:09:54.588074 ip-10-0-131-216 kubenswrapper[2573]: I0217 13:09:54.588039 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-tzn4r_a17ee1be-195d-4b7e-8690-072cd431deef/csi-liveness-probe/0.log"